Nov 25 12:07:58 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 25 12:07:58 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:58 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 
12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 25 12:07:59 crc 
restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 
12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 
12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 25 12:07:59 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 25 12:08:00 crc kubenswrapper[4693]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 12:08:00 crc kubenswrapper[4693]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 25 12:08:00 crc kubenswrapper[4693]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 25 12:08:00 crc kubenswrapper[4693]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 25 12:08:00 crc kubenswrapper[4693]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Nov 25 12:08:00 crc kubenswrapper[4693]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.537318 4693 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546175 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546207 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546221 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546231 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546241 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546250 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546259 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546269 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546278 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546286 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546294 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546302 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546309 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546317 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546325 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546332 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546341 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546349 4693 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546357 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546364 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546411 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546421 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546429 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546437 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546445 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546452 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546460 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546467 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546475 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546483 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546490 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546498 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546506 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546516 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546523 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546531 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546538 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546546 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546554 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546562 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546570 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546577 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546585 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546593 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546601 4693 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546608 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546620 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546630 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546640 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546650 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546658 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546667 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546676 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546687 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546697 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546706 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546714 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546721 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546729 4693 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546737 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546746 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546754 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546762 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546769 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546780 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546788 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546798 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
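
The long run of feature_gate.go:330 warnings here repeats several times during startup: OpenShift passes its full gate set to the embedded kubelet, which only recognizes upstream Kubernetes gates. A quick way to see how many distinct gates are involved, and how many parse passes ran, is to tally the warnings from a saved journal. A minimal sketch, assuming the excerpt was saved to the hypothetical file kubelet.log:

    import re
    from collections import Counter

    # Count "unrecognized feature gate" warnings per gate name; the input
    # path is hypothetical (e.g. journalctl -u kubelet > kubelet.log).
    pattern = re.compile(r"unrecognized feature gate: (\w+)")
    with open("kubelet.log") as f:
        gates = Counter(pattern.findall(f.read()))

    # Each gate appears once per parse pass, so the per-gate count also
    # shows how many times the gate map was applied during startup.
    for gate, n in gates.most_common():
        print(f"{n:3d}  {gate}")
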
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546808 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546815 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546823 4693 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.546831 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547038 4693 flags.go:64] FLAG: --address="0.0.0.0"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547061 4693 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547081 4693 flags.go:64] FLAG: --anonymous-auth="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547095 4693 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547111 4693 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547123 4693 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547135 4693 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547147 4693 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547157 4693 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547167 4693 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547177 4693 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547186 4693 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547195 4693 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547204 4693 flags.go:64] FLAG: --cgroup-root=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547213 4693 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547223 4693 flags.go:64] FLAG: --client-ca-file=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547232 4693 flags.go:64] FLAG: --cloud-config=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547240 4693 flags.go:64] FLAG: --cloud-provider=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547249 4693 flags.go:64] FLAG: --cluster-dns="[]"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547259 4693 flags.go:64] FLAG: --cluster-domain=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547269 4693 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547278 4693 flags.go:64] FLAG: --config-dir=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547287 4693 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547297 4693 flags.go:64] FLAG: --container-log-max-files="5"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547308 4693 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547317 4693 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547327 4693 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547336 4693 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547346 4693 flags.go:64] FLAG: --contention-profiling="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547355 4693 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547364 4693 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547403 4693 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547413 4693 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547423 4693 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547434 4693 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547446 4693 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547457 4693 flags.go:64] FLAG: --enable-load-reader="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547488 4693 flags.go:64] FLAG: --enable-server="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547500 4693 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547514 4693 flags.go:64] FLAG: --event-burst="100"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547524 4693 flags.go:64] FLAG: --event-qps="50"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547534 4693 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547543 4693 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547552 4693 flags.go:64] FLAG: --eviction-hard=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547564 4693 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547573 4693 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547583 4693 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547592 4693 flags.go:64] FLAG: --eviction-soft=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547601 4693 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547610 4693 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547619 4693 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547628 4693 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547637 4693 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547646 4693 flags.go:64] FLAG: --fail-swap-on="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547656 4693 flags.go:64] FLAG: --feature-gates=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547667 4693 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547676 4693 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547686 4693 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547697 4693 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547707 4693 flags.go:64] FLAG: --healthz-port="10248"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547716 4693 flags.go:64] FLAG: --help="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547725 4693 flags.go:64] FLAG: --hostname-override=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547734 4693 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547744 4693 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547754 4693 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547764 4693 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547773 4693 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547783 4693 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547793 4693 flags.go:64] FLAG: --image-service-endpoint=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547801 4693 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547810 4693 flags.go:64] FLAG: --kube-api-burst="100"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547819 4693 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547829 4693 flags.go:64] FLAG: --kube-api-qps="50"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547838 4693 flags.go:64] FLAG: --kube-reserved=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547847 4693 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547855 4693 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547865 4693 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547874 4693 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547883 4693 flags.go:64] FLAG: --lock-file=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547892 4693 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547902 4693 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547912 4693 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547936 4693 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547945 4693 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547954 4693 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547963 4693 flags.go:64] FLAG: --logging-format="text"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547971 4693 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547981 4693 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.547991 4693 flags.go:64] FLAG: --manifest-url=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548000 4693 flags.go:64] FLAG: --manifest-url-header=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548012 4693 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548021 4693 flags.go:64] FLAG: --max-open-files="1000000"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548032 4693 flags.go:64] FLAG: --max-pods="110"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548041 4693 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548050 4693 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548059 4693 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548069 4693 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548078 4693 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548109 4693 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548119 4693 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548140 4693 flags.go:64] FLAG: --node-status-max-images="50"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548149 4693 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548158 4693 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548173 4693 flags.go:64] FLAG: --pod-cidr=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548182 4693 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548194 4693 flags.go:64] FLAG: --pod-manifest-path=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548203 4693 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548212 4693 flags.go:64] FLAG: --pods-per-core="0"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548221 4693 flags.go:64] FLAG: --port="10250"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548231 4693 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548239 4693 flags.go:64] FLAG: --provider-id=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548248 4693 flags.go:64] FLAG: --qos-reserved=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548257 4693 flags.go:64] FLAG: --read-only-port="10255"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548266 4693 flags.go:64] FLAG: --register-node="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548275 4693 flags.go:64] FLAG: --register-schedulable="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548283 4693 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548299 4693 flags.go:64] FLAG: --registry-burst="10"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548308 4693 flags.go:64] FLAG: --registry-qps="5"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548317 4693 flags.go:64] FLAG: --reserved-cpus=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548326 4693 flags.go:64] FLAG: --reserved-memory=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548337 4693 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548346 4693 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548356 4693 flags.go:64] FLAG: --rotate-certificates="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548364 4693 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548398 4693 flags.go:64] FLAG: --runonce="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548407 4693 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548416 4693 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548426 4693 flags.go:64] FLAG: --seccomp-default="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548435 4693 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548444 4693 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548454 4693 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548463 4693 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548472 4693 flags.go:64] FLAG: --storage-driver-password="root"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548481 4693 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548490 4693 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548502 4693 flags.go:64] FLAG: --storage-driver-user="root"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548510 4693 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548520 4693 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548529 4693 flags.go:64] FLAG: --system-cgroups=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548538 4693 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548552 4693 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548561 4693 flags.go:64] FLAG: --tls-cert-file=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548570 4693 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548581 4693 flags.go:64] FLAG: --tls-min-version=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548589 4693 flags.go:64] FLAG: --tls-private-key-file=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548598 4693 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548607 4693 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548616 4693 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548625 4693 flags.go:64] FLAG: --v="2"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548637 4693 flags.go:64] FLAG: --version="false"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548649 4693 flags.go:64] FLAG: --vmodule=""
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548660 4693 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.548671 4693 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548903 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548914 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548923 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548932 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548940 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548949 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548958 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548966 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548975 4693 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548983 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548991 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.548998 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549006 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549015 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549026 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549033 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549041 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549052 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549062 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549070 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549224 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549232 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549240 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549251 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549260 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549267 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549275 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549283 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549291 4693 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549298 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549306 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549314 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549322 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549330 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549338 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549346 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549354 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549362 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549395 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549404 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549412 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549420 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549427 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549435 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549443 4693 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549450 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549461 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549469 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549476 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549484 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549492 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549499 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549507 4693 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549517 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549527 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549535 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549543 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549550 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549558 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549566 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549574 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549582 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549589 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549597 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549607 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
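
The flags.go:64 block above records the effective value of every CLI flag at startup, which makes it handy for diffing kubelet invocations across reboots or nodes. A minimal sketch that extracts the FLAG lines into a dict, again assuming the hypothetical kubelet.log dump:

    import re

    # Match lines like: flags.go:64] FLAG: --node-ip="192.168.126.11"
    flag_re = re.compile(r'flags\.go:\d+\] FLAG: (--[\w-]+)="(.*)"')
    flags = {}
    with open("kubelet.log") as f:
        for line in f:
            m = flag_re.search(line)
            if m:
                flags[m.group(1)] = m.group(2)

    print(flags["--node-ip"])          # 192.168.126.11 in this log
    print(flags["--system-reserved"])  # cpu=200m,ephemeral-storage=350Mi,memory=350Mi
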
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549617 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549625 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549634 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549644 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549653 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.549661 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.549688 4693 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.562201 4693 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.562288 4693 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562474 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562493 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562500 4693 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562507 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562513 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562519 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562524 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562530 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562535 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562543 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562548 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562553 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562559 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562564 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562571 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562580 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562587 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562593 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562599 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562606 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562612 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562618 4693 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562624 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562630 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562635 4693 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562644 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562649 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562655 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562660 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562666 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562671 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562676 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562684 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562690 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562699 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562705 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562712 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562720 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562727 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562733 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562739 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562744 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562750 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562755 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562761 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562766 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562772 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562777 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562783 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562788 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562794 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562799 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562805 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562810 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562816 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562821 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562828 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562835 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562841 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562848 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562854 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562860 4693 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562866 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562871 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562877 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562883 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562889 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562894 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562900 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562905 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.562912 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.562921 4693 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563112 4693 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563125 4693 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563133 4693 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563141 4693 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563147 4693 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563153 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563159 4693 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563165 4693 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563171 4693 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563178 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563183 4693 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563189 4693 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563194 4693 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563200 4693 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563206 4693 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563211 4693 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563216 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563222 4693 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563227 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563232 4693 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563238 4693 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563243 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563249 4693 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563254 4693 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563260 4693 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563266 4693 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563272 4693 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563278 4693 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563283 4693 feature_gate.go:330] unrecognized feature gate: Example
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563289 4693 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563294 4693 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563300 4693 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563305 4693 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563310 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563317 4693 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563323 4693 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563328 4693 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563333 4693 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563339 4693 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563345 4693 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563351 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563356 4693 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563362 4693 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563367 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563396 4693 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563402 4693 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563409 4693 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563416 4693 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563423 4693 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563430 4693 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563436 4693 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563442 4693 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563450 4693 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563457 4693 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563465 4693 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563472 4693 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563479 4693 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563487 4693 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563494 4693 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563500 4693 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563506 4693 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563512 4693 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563518 4693 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563524 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563530 4693 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563537 4693 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563544 4693 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563550 4693 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563556 4693 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563562 4693 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.563570 4693 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.563579 4693 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.564889 4693 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.569697 4693 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.569808 4693 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
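
The feature_gate.go:386 entries above show the resolved gate map, identical across all three parse passes, printed in Go's map syntax. A minimal sketch of turning one of those lines into a Python dict, with the map abbreviated here for brevity:

    # Abbreviated from the feature_gate.go:386 entries above.
    line = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false ValidatingAdmissionPolicy:true]}"
    inner = line.split("map[", 1)[1].rsplit("]", 1)[0]
    gates = {name: val == "true"
             for name, val in (pair.split(":") for pair in inner.split())}
    assert gates["KMSv1"] and not gates["NodeSwap"]
    print(gates)
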
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.571512 4693 server.go:997] "Starting client certificate rotation" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.571540 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.571857 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-07 07:36:10.872158897 +0000 UTC Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.571993 4693 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 283h28m10.300171115s for next certificate rotation Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.633071 4693 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.636476 4693 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.653587 4693 log.go:25] "Validated CRI v1 runtime API" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.692015 4693 log.go:25] "Validated CRI v1 image API" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.695430 4693 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.704831 4693 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-25-12-03-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.704877 4693 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.723514 4693 manager.go:217] Machine: {Timestamp:2025-11-25 12:08:00.720443019 +0000 UTC m=+0.638528420 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc BootID:5664b910-f808-4ca5-913a-47b9ca069334 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 
Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:cb:07:ec Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:cb:07:ec Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d2:b6:ec Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2d:97:92 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:79:67:51 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:bf:9e:03 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a2:a8:23:8f:d4:07 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3e:e7:70:52:0a:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.723876 4693 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.724163 4693 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.724897 4693 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.725246 4693 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.725317 4693 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.725746 4693 topology_manager.go:138] "Creating topology manager with none policy" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.725768 4693 container_manager_linux.go:303] "Creating device plugin manager" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.726364 4693 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.726443 4693 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 25 
12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.727313 4693 state_mem.go:36] "Initialized new in-memory state store" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.727532 4693 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.731618 4693 kubelet.go:418] "Attempting to sync node with API server" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.731713 4693 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.731754 4693 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.731777 4693 kubelet.go:324] "Adding apiserver pod source" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.731796 4693 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.736732 4693 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.738098 4693 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.739488 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.739507 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.739629 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.739635 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.742399 4693 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744042 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744083 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744098 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744111 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Nov 25 12:08:00 crc kubenswrapper[4693]: 
I1125 12:08:00.744133 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744146 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744159 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744180 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744196 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744213 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744233 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.744246 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.745453 4693 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.746124 4693 server.go:1280] "Started kubelet" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.746463 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.747246 4693 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.747429 4693 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.748115 4693 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 25 12:08:00 crc systemd[1]: Started Kubernetes Kubelet. 
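
At this point the kubelet itself is up (listening on 0.0.0.0:10250, serving the podresources API on its unix socket, and systemd reports "Started Kubernetes Kubelet."), yet every list against https://api-int.crc.testing:6443 fails with "connection refused". On a single-node CRC instance this is expected during boot: the kube-apiserver runs as a static pod that this very kubelet has yet to start, so client-go's informers keep retrying until it comes up. A stdlib-only sketch of that kind of capped-backoff probe, with a hypothetical waitForAPI helper (client-go's reflector backoff is the real mechanism and is not reproduced here); the endpoint is taken from the log:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // waitForAPI dials the endpoint until it accepts a TCP connection,
    // doubling the delay up to a cap -- the same retry shape the
    // reflector errors above imply, not client-go's implementation.
    func waitForAPI(addr string) {
        delay := time.Second
        for {
            conn, err := net.DialTimeout("tcp", addr, 3*time.Second)
            if err == nil {
                conn.Close()
                fmt.Println("apiserver reachable:", addr)
                return
            }
            fmt.Printf("dial %s: %v; retrying in %s\n", addr, err, delay)
            time.Sleep(delay)
            if delay < 30*time.Second {
                delay *= 2
            }
        }
    }

    func main() {
        waitForAPI("api-int.crc.testing:6443")
    }
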
Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.755062 4693 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.755142 4693 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.755585 4693 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.755637 4693 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 09:57:34.08923659 +0000 UTC Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.755740 4693 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 813h49m33.333505046s for next certificate rotation Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.756720 4693 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.756769 4693 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.757332 4693 factory.go:55] Registering systemd factory Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.757411 4693 factory.go:221] Registration of the systemd container factory successfully Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.757570 4693 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.758219 4693 factory.go:153] Registering CRI-O factory Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.758269 4693 factory.go:221] Registration of the crio container factory successfully Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.758344 4693 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.758404 4693 factory.go:103] Registering Raw factory Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.758424 4693 manager.go:1196] Started watching for new ooms in manager Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.757323 4693 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187b3e9324e98c3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 12:08:00.746081341 +0000 UTC m=+0.664166762,LastTimestamp:2025-11-25 12:08:00.746081341 +0000 UTC m=+0.664166762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.761874 4693 manager.go:319] Starting recovery of all containers Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.763033 4693 server.go:460] "Adding debug handlers to kubelet server" Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.763931 4693 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.763985 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.766182 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.775314 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.775874 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.775904 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.775926 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.775946 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.775968 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.775986 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776006 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 25 12:08:00 
crc kubenswrapper[4693]: I1125 12:08:00.776028 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776048 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776068 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776089 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776109 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776132 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776150 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776168 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776192 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776212 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776231 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776249 4693 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776268 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776288 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776307 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776326 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776345 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776363 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776417 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776437 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776456 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776475 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776495 4693 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776516 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776537 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776556 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776575 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776593 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776614 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776633 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776655 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776676 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776697 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776718 4693 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776739 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776761 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776780 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776798 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776821 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776843 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776864 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776884 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776902 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776921 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776945 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776966 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.776991 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777020 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777049 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777074 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777094 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777115 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777134 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777153 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777175 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777194 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777222 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777248 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777272 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777293 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777313 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777331 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777352 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777401 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777420 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777445 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777464 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777483 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777501 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777521 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777541 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777592 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777615 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777634 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777652 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777670 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777695 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777714 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777735 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777754 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777776 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777797 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777816 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777834 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777853 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777872 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777891 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777910 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777932 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777951 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777970 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.777991 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778011 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778031 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778049 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778067 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778096 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778118 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778139 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778159 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778180 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778200 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778219 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778242 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778263 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778283 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778300 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778319 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778344 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778363 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778408 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778426 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778444 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778465 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778483 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778503 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778522 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778540 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778558 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778576 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778596 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778615 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778633 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778650 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.778670 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782074 4693 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782130 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782154 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782176 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782196 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782218 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782236 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782255 4693 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782273 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782292 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782314 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782334 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782353 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782400 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782419 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782438 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782457 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782479 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782497 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782515 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782535 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782553 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782572 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782591 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782628 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782648 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782666 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782689 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782710 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782728 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782747 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782766 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782784 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782805 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782824 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782843 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782865 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782883 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782902 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782923 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782941 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782959 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.782979 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783000 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783019 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783038 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783059 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783078 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783098 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783117 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783136 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783153 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783175 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783194 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783213 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783234 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783252 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783271 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783292 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783313 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783331 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783350 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783483 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783508 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783526 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783545 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783567 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783585 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783604 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783624 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783642 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783660 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783679 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783697 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783715 4693 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783767 4693 reconstruct.go:97] "Volume reconstruction finished" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.783782 4693 reconciler.go:26] "Reconciler: start to sync state" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.800601 4693 manager.go:324] Recovery completed Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.809114 4693 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.811218 4693 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.811365 4693 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.811499 4693 kubelet.go:2335] "Starting kubelet main sync loop" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.811626 4693 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 25 12:08:00 crc kubenswrapper[4693]: W1125 12:08:00.815720 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.815824 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.816223 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.817873 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.817918 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.817931 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.818852 4693 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.818884 4693 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.818918 4693 state_mem.go:36] "Initialized new in-memory state store" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.830064 4693 policy_none.go:49] "None policy: Start" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.832423 4693 memory_manager.go:170] "Starting 
memorymanager" policy="None" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.832484 4693 state_mem.go:35] "Initializing new in-memory state store" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.856894 4693 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.891550 4693 manager.go:334] "Starting Device Plugin manager" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.891636 4693 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.891656 4693 server.go:79] "Starting device plugin registration server" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.892241 4693 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.892312 4693 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.892538 4693 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.892800 4693 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.892823 4693 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.902020 4693 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.912308 4693 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.912542 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.913978 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.914018 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.914031 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.914167 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.914601 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.914683 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.914960 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.915002 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.915010 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.915168 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.915569 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.915734 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916060 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916078 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916086 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916156 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916293 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916339 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916297 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916401 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.916411 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917070 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917089 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917099 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917239 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917292 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917255 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917318 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917327 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917600 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917767 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.917797 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.918589 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.918655 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.918678 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.918808 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.918830 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.918839 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.918990 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.919025 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.919821 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.919875 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.919895 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.964844 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="400ms" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.986624 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.986704 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.986732 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.986838 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.986891 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.986919 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987007 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987104 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987142 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987171 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987205 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987243 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987271 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987302 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.987349 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.992466 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.994233 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.994266 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.994275 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:00 crc kubenswrapper[4693]: I1125 12:08:00.994305 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 12:08:00 crc kubenswrapper[4693]: E1125 12:08:00.994907 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088516 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088559 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088578 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088643 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088662 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088684 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088700 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088715 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088731 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088747 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088763 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088778 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088793 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:01 crc 
kubenswrapper[4693]: I1125 12:08:01.088838 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088853 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.088964 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089009 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089038 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089099 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089157 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089125 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089203 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089175 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089239 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089236 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089240 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089169 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089243 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.089318 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.195475 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.197424 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.197495 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.197514 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.197560 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 12:08:01 crc kubenswrapper[4693]: E1125 12:08:01.198399 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" 
Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.265642 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.291525 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.298046 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.312987 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.315483 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8208d960793bc773a5d88fdd79248dc9d42053527d04bc8fcc1d9839217c59f7 WatchSource:0}: Error finding container 8208d960793bc773a5d88fdd79248dc9d42053527d04bc8fcc1d9839217c59f7: Status 404 returned error can't find the container with id 8208d960793bc773a5d88fdd79248dc9d42053527d04bc8fcc1d9839217c59f7 Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.320017 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.326598 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d9ce040ee36b68550c18d053d9fa9cbb172678069f8231ca815ed7b95eb431b2 WatchSource:0}: Error finding container d9ce040ee36b68550c18d053d9fa9cbb172678069f8231ca815ed7b95eb431b2: Status 404 returned error can't find the container with id d9ce040ee36b68550c18d053d9fa9cbb172678069f8231ca815ed7b95eb431b2 Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.337088 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c83c86e9c042b551582598a8f131512206a46cb8a33d7fdb400a8688dc8e0663 WatchSource:0}: Error finding container c83c86e9c042b551582598a8f131512206a46cb8a33d7fdb400a8688dc8e0663: Status 404 returned error can't find the container with id c83c86e9c042b551582598a8f131512206a46cb8a33d7fdb400a8688dc8e0663 Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.341917 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-fc5e9161f459870a357f34ab34207d7b6730ac8d2bf0ec061985017adb59e5c3 WatchSource:0}: Error finding container fc5e9161f459870a357f34ab34207d7b6730ac8d2bf0ec061985017adb59e5c3: Status 404 returned error can't find the container with id fc5e9161f459870a357f34ab34207d7b6730ac8d2bf0ec061985017adb59e5c3 Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.347331 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-60aae0a8325e10848f3863738f6085da6fd6ab8c7a5fae7ed59d9ddb02d0b521 WatchSource:0}: Error finding container 60aae0a8325e10848f3863738f6085da6fd6ab8c7a5fae7ed59d9ddb02d0b521: Status 404 returned error can't 
find the container with id 60aae0a8325e10848f3863738f6085da6fd6ab8c7a5fae7ed59d9ddb02d0b521 Nov 25 12:08:01 crc kubenswrapper[4693]: E1125 12:08:01.366414 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms" Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.577850 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:01 crc kubenswrapper[4693]: E1125 12:08:01.577970 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.599097 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.601092 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.601144 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.601173 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.601205 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 12:08:01 crc kubenswrapper[4693]: E1125 12:08:01.601784 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.649879 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:01 crc kubenswrapper[4693]: E1125 12:08:01.649985 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.747578 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.820414 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fc5e9161f459870a357f34ab34207d7b6730ac8d2bf0ec061985017adb59e5c3"} Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.821774 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c83c86e9c042b551582598a8f131512206a46cb8a33d7fdb400a8688dc8e0663"} Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.823027 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d9ce040ee36b68550c18d053d9fa9cbb172678069f8231ca815ed7b95eb431b2"} Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.824209 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8208d960793bc773a5d88fdd79248dc9d42053527d04bc8fcc1d9839217c59f7"} Nov 25 12:08:01 crc kubenswrapper[4693]: I1125 12:08:01.825315 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"60aae0a8325e10848f3863738f6085da6fd6ab8c7a5fae7ed59d9ddb02d0b521"} Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.943492 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:01 crc kubenswrapper[4693]: E1125 12:08:01.943761 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:01 crc kubenswrapper[4693]: W1125 12:08:01.974847 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:01 crc kubenswrapper[4693]: E1125 12:08:01.974953 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:02 crc kubenswrapper[4693]: E1125 12:08:02.168299 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.402321 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.404907 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.405013 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.405034 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.405108 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 12:08:02 crc kubenswrapper[4693]: E1125 12:08:02.406199 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.748208 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.833083 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.833151 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.833165 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.833172 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.833332 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.834535 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.834605 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.834633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.834714 4693 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c" exitCode=0 Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.834811 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.834947 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.835840 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.835901 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.835926 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.836551 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce" exitCode=0 Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.836603 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.836704 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.837536 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.837574 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.837587 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.839257 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.839359 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.839275 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e24d8e1e01435c7c5b42c961b1af89427f44a2863caa657addbe185b3b729776"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.839252 4693 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e24d8e1e01435c7c5b42c961b1af89427f44a2863caa657addbe185b3b729776" exitCode=0 Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.840691 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.840715 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.840760 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 
12:08:02.840765 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.840775 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.840792 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.841788 4693 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e" exitCode=0 Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.841829 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e"} Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.841913 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.843097 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.843164 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:02 crc kubenswrapper[4693]: I1125 12:08:02.843193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.393999 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.748045 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:03 crc kubenswrapper[4693]: W1125 12:08:03.766193 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:03 crc kubenswrapper[4693]: E1125 12:08:03.766301 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:03 crc kubenswrapper[4693]: E1125 12:08:03.768863 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="3.2s" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.854030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.854098 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.854120 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.854141 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.856200 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"04331f8bd66a506fb2e02c8472f0690d285e6ffb3320d0bd83c751f2bf175472"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.856335 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.857342 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.857387 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.857400 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.859990 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.860032 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.860052 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.860108 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.861295 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.861327 4693 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.861339 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.861781 4693 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7" exitCode=0 Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.861875 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.861909 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.861875 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7"} Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.862752 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.862836 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.862855 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.863362 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.863455 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.863469 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.929114 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:03 crc kubenswrapper[4693]: I1125 12:08:03.933870 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:03 crc kubenswrapper[4693]: W1125 12:08:03.999339 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 25 12:08:03 crc kubenswrapper[4693]: E1125 12:08:03.999473 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.006767 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.008696 4693 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.008759 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.008773 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.008809 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 12:08:04 crc kubenswrapper[4693]: E1125 12:08:04.009386 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.872206 4693 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b" exitCode=0 Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.872289 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b"} Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.872400 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.873581 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.873649 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.873669 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.877139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162"} Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.877187 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.877275 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.877453 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.877542 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.877471 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880142 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880165 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 
12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880201 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880224 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880202 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880267 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880161 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880230 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880406 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880423 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:04 crc kubenswrapper[4693]: I1125 12:08:04.880434 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.704981 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885486 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4"} Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885547 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574"} Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885563 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6"} Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885575 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368"} Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885588 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885604 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885671 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.885693 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.886957 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887002 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887015 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887165 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887224 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887242 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887168 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887305 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:05 crc kubenswrapper[4693]: I1125 12:08:05.887316 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.447414 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.894811 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f"} Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.894935 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.894935 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.896913 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.896972 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.896996 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.897065 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.897097 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.897111 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 25 12:08:06 crc kubenswrapper[4693]: I1125 12:08:06.915101 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.210530 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.213017 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.213085 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.213102 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.213147 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.897450 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.897608 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.898361 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.898423 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.898438 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.899076 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.899143 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.899157 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.905180 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.905451 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.906802 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.906829 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:07 crc kubenswrapper[4693]: I1125 12:08:07.906839 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.318761 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.900871 4693 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.900964 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.902485 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.902530 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.902547 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.902663 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.902717 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:08 crc kubenswrapper[4693]: I1125 12:08:08.902743 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:10 crc kubenswrapper[4693]: I1125 12:08:10.026084 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:10 crc kubenswrapper[4693]: I1125 12:08:10.026597 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:10 crc kubenswrapper[4693]: I1125 12:08:10.028202 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:10 crc kubenswrapper[4693]: I1125 12:08:10.028255 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:10 crc kubenswrapper[4693]: I1125 12:08:10.028266 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:10 crc kubenswrapper[4693]: E1125 12:08:10.902137 4693 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 25 12:08:13 crc kubenswrapper[4693]: I1125 12:08:13.026509 4693 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 25 12:08:13 crc kubenswrapper[4693]: I1125 12:08:13.026602 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 12:08:13 crc kubenswrapper[4693]: I1125 12:08:13.942767 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:13 crc kubenswrapper[4693]: I1125 12:08:13.942910 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:13 crc kubenswrapper[4693]: I1125 
12:08:13.944070 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:13 crc kubenswrapper[4693]: I1125 12:08:13.944148 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:13 crc kubenswrapper[4693]: I1125 12:08:13.944174 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.161823 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.162033 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.163159 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.163202 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.163216 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.188931 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.189055 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 25 12:08:14 crc kubenswrapper[4693]: W1125 12:08:14.723171 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.723299 4693 trace.go:236] Trace[1740604024]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 12:08:04.721) (total time: 10001ms): Nov 25 12:08:14 crc kubenswrapper[4693]: Trace[1740604024]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:08:14.723) Nov 25 12:08:14 crc kubenswrapper[4693]: Trace[1740604024]: [10.001986356s] [10.001986356s] END Nov 25 12:08:14 crc kubenswrapper[4693]: E1125 12:08:14.723324 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 25 12:08:14 crc kubenswrapper[4693]: I1125 12:08:14.748514 4693 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
net/http: TLS handshake timeout Nov 25 12:08:15 crc kubenswrapper[4693]: W1125 12:08:15.111429 4693 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 25 12:08:15 crc kubenswrapper[4693]: I1125 12:08:15.111536 4693 trace.go:236] Trace[630183390]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 12:08:05.109) (total time: 10001ms): Nov 25 12:08:15 crc kubenswrapper[4693]: Trace[630183390]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:08:15.111) Nov 25 12:08:15 crc kubenswrapper[4693]: Trace[630183390]: [10.001694038s] [10.001694038s] END Nov 25 12:08:15 crc kubenswrapper[4693]: E1125 12:08:15.111574 4693 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 25 12:08:15 crc kubenswrapper[4693]: I1125 12:08:15.359891 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 25 12:08:15 crc kubenswrapper[4693]: I1125 12:08:15.359960 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 25 12:08:15 crc kubenswrapper[4693]: I1125 12:08:15.363550 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 25 12:08:15 crc kubenswrapper[4693]: I1125 12:08:15.363609 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 25 12:08:15 crc kubenswrapper[4693]: I1125 12:08:15.711177 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]log ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]etcd ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/openshift.io-api-request-count-filter ok Nov 25 12:08:15 crc kubenswrapper[4693]: 
[+]poststarthook/openshift.io-startkubeinformers ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/generic-apiserver-start-informers ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/priority-and-fairness-config-consumer ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/priority-and-fairness-filter ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-apiextensions-informers ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-apiextensions-controllers ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/crd-informer-synced ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-system-namespaces-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-cluster-authentication-info-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-legacy-token-tracking-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-service-ip-repair-controllers ok Nov 25 12:08:15 crc kubenswrapper[4693]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Nov 25 12:08:15 crc kubenswrapper[4693]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/priority-and-fairness-config-producer ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/bootstrap-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/start-kube-aggregator-informers ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/apiservice-status-local-available-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/apiservice-status-remote-available-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/apiservice-registration-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/apiservice-wait-for-first-sync ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/apiservice-discovery-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/kube-apiserver-autoregistration ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]autoregister-completion ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/apiservice-openapi-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: [+]poststarthook/apiservice-openapiv3-controller ok Nov 25 12:08:15 crc kubenswrapper[4693]: livez check failed Nov 25 12:08:15 crc kubenswrapper[4693]: I1125 12:08:15.711270 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:08:18 crc kubenswrapper[4693]: I1125 12:08:18.765759 4693 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 
12:08:20.360157 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.364218 4693 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.364226 4693 trace.go:236] Trace[1727190432]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 12:08:08.944) (total time: 11419ms): Nov 25 12:08:20 crc kubenswrapper[4693]: Trace[1727190432]: ---"Objects listed" error: 11419ms (12:08:20.364) Nov 25 12:08:20 crc kubenswrapper[4693]: Trace[1727190432]: [11.41943917s] [11.41943917s] END Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.364259 4693 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.365776 4693 trace.go:236] Trace[69974450]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (25-Nov-2025 12:08:09.721) (total time: 10643ms): Nov 25 12:08:20 crc kubenswrapper[4693]: Trace[69974450]: ---"Objects listed" error: 10643ms (12:08:20.365) Nov 25 12:08:20 crc kubenswrapper[4693]: Trace[69974450]: [10.64392078s] [10.64392078s] END Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.365797 4693 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.368217 4693 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.437701 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33116->192.168.126.11:17697: read: connection reset by peer" start-of-body= Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.437770 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33116->192.168.126.11:17697: read: connection reset by peer" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.526172 4693 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.645613 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.649349 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.708683 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.709265 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.709320 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.713059 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.743997 4693 apiserver.go:52] "Watching apiserver" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.746751 4693 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.747117 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.747550 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.747706 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.747779 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.747883 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.748179 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.748200 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.748220 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.748191 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.748419 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.750451 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.750461 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.750802 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.750827 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.750720 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.750772 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.752133 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.752599 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.752652 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.758170 4693 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.761527 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766082 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766130 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 
12:08:20.766162 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766184 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766206 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766249 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766273 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766297 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766320 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766342 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766363 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766406 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 12:08:20 crc 
kubenswrapper[4693]: I1125 12:08:20.766430 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766458 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766477 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766497 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766519 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766558 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766579 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766599 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766619 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766642 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 12:08:20 
crc kubenswrapper[4693]: I1125 12:08:20.766663 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766683 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766704 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766753 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766768 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766783 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766806 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766826 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766844 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766861 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 
12:08:20.766881 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766900 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766931 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766952 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766973 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.766989 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767004 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767019 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767035 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767051 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc 
kubenswrapper[4693]: I1125 12:08:20.767065 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767080 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767102 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767122 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767151 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767170 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767196 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767216 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767278 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767306 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 
12:08:20.767333 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767353 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767394 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767418 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767438 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767640 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767671 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767713 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767809 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767853 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767870 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767886 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767901 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767920 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767941 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767962 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.767986 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768007 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768027 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768046 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768072 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768124 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768146 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768170 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768195 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768220 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768239 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768255 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768270 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768286 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768308 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768397 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768431 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768450 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768465 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768479 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768494 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768515 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768535 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768561 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768582 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768813 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768870 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768895 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768918 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768959 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.768985 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769026 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769050 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769072 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769094 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769114 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769135 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769157 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769179 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769202 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769225 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769246 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769268 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769288 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769307 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769328 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769350 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769394 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769420 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769443 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769437 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769466 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769488 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769514 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769539 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769561 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769586 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769608 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769631 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769663 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769690 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.769791 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770264 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770469 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770538 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770770 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770818 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770849 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770891 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770902 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.770919 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771017 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771044 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771077 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771107 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771129 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771151 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771177 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771203 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771221 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771230 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771248 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771275 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771306 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771337 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771364 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771416 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771603 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771633 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771769 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771803 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771831 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771859 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771887 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771910 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771935 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771963 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771987 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772010 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772034 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772061 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772085 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772142 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772168 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772197 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772223 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772247 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772273 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772298 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772320 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772343 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772395 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772424 4693 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772452 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772479 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772502 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772529 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772554 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772577 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772603 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772625 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772676 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 
12:08:20.772702 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772725 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772749 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772773 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772798 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772822 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772845 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772870 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772893 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772919 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772951 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772977 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773005 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773028 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773051 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773072 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773111 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773136 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773161 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773191 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773226 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773255 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773280 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773299 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773317 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773338 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773358 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773396 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773415 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773434 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773508 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773525 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773539 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773549 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773559 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773568 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773580 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773591 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773601 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773611 4693 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775440 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771632 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.782602 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.782703 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.782866 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.783055 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.783338 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.783621 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.783614 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771816 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771914 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771938 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772227 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772253 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772293 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772332 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772354 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772404 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.785689 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772572 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.772910 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773192 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773232 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773408 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773566 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773545 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.773728 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:08:21.273699737 +0000 UTC m=+21.191785188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.785898 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773749 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773878 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773954 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773981 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.773994 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.774148 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.774245 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.774303 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.774442 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.774845 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.774890 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775066 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775178 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775162 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775534 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775560 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.786229 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). 
InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775656 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.775660 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.776572 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.776767 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.776861 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.777016 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.777041 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.776877 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.777400 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.777483 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.777516 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.777660 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.786456 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.778061 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.778088 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.778133 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.786512 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.778663 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.778738 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.778767 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779061 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779117 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). 
InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779128 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779208 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779442 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779265 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779685 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.786766 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779789 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779826 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779954 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.779967 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780139 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780224 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780529 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780494 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780641 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780700 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780717 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780820 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.780827 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.777136 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.781688 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.781770 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.781772 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.781854 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.787030 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.782167 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.782173 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.782529 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.783837 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.783871 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.783916 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.771665 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.784144 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.784206 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.784504 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.784521 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.786624 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.786591 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.786831 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.782570 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.787365 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.787439 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.787500 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.787789 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.787928 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788077 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788216 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788243 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788435 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788484 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788513 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788638 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788873 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.788986 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789016 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.785703 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789128 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789264 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789311 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789330 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789482 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789595 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789657 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789679 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789738 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789748 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789787 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789955 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.789978 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.790136 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.790168 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.790198 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.790257 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.790515 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.791112 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.791202 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.791318 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.791404 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.791464 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:21.29141718 +0000 UTC m=+21.209502571 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.791507 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.791550 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.791700 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.791486 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.792645 4693 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.792333 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.793830 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.793870 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.794190 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.794434 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.794578 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.795346 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.795557 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.796428 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.796449 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.796543 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.796638 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.796657 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.796767 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:21.296740011 +0000 UTC m=+21.214825612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.802136 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.804662 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.805785 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.807455 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.807984 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.808045 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.808074 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.808196 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:21.308163345 +0000 UTC m=+21.226248756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.809543 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.811464 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.813428 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.813456 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.813475 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.813544 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:21.313523188 +0000 UTC m=+21.231608779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.813743 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.813748 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.814513 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.814671 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.814992 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.815019 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.815062 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.815248 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.815255 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.815359 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.815620 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.815672 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.816668 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.817008 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.817258 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.817553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.817644 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.817729 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.817853 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.818078 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.818146 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.818209 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.818545 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.823876 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.825225 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.828745 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.829546 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.830662 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.834626 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.836078 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.838758 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.839033 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.840829 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.840944 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
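
The status_manager.go:875 records above embed the JSON patch the kubelet tried to send, with every quote escaped twice by the time it reaches the journal (once by the kubelet's error formatting, once by klog's quoting), so each quote appears as \\\" in the text. A small Go sketch that recovers and pretty-prints such a patch; journalPatch here is a shortened, hypothetical stand-in for the much longer patches in the records above:

    package main

    import (
    	"bytes"
    	"encoding/json"
    	"fmt"
    	"strings"
    )

    // Simplified excerpt in journal form; the real patches above decode the same way.
    const journalPatch = `{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"phase\\\":\\\"Running\\\"}}`

    func main() {
    	// Peel both escaping layers at once: \\\" in the journal is a plain quote.
    	raw := strings.ReplaceAll(journalPatch, `\\\"`, `"`)
    	var pretty bytes.Buffer
    	if err := json.Indent(&pretty, []byte(raw), "", "  "); err != nil {
    		panic(err)
    	}
    	fmt.Println(pretty.String())
    }

Applied to the full err= field of any record above, the same replacement yields the complete patch document, including the $setElementOrder directives of a strategic-merge patch.
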
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.841208 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.841829 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.843133 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.844136 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.845088 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.847253 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.848036 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.849504 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.851109 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.851473 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.851628 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.852826 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.855064 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.855720 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.856616 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.858064 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.858920 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.860800 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.860895 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.861455 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.863189 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.864803 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.865444 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.866718 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.867217 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.868304 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.868933 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.869605 4693 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.869739 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.871753 4693 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.872405 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.873179 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874036 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874075 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874192 4693 
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874204 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874214 4693 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874223 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874233 4693 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874244 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874254 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874263 4693 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874272 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874284 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874296 4693 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874307 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874318 4693 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874328 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874337 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874345 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874353 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874361 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874362 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874868 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874916 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874387 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874938 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874948 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874958 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874968 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874977 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874985 4693 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.874993 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875001 4693 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875010 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875019 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875028 4693 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875036 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875044 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875052 4693 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875062 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875073 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875082 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875090 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875098 4693 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875107 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875115 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875123 4693 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875131 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875140 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875148 4693 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875156 4693 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875165 4693 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875172 4693 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875181 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875189 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875198 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875207 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875215 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875223 4693 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875231 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875239 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875248 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875257 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875265 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875272 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875280 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875289 4693 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875296 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875304 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875312 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875320 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875350 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875358 4693 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875366 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875390 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875398 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875406 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875414 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875422 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875430 4693 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875438 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875446 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875454 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875462 4693 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875470 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875478 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875486 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875493 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875501 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875510 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875518 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875526 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875534 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875542 4693 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875551 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875559 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875568 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875577 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875586 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875594 4693 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875603 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875611 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875619 4693 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875627 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875637 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875644 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875653 4693 reconciler_common.go:293] "Volume detached for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875667 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875675 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875684 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875693 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875701 4693 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875709 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875724 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875732 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875741 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875749 4693 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875757 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875765 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875773 4693 reconciler_common.go:293] "Volume 
detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875782 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875790 4693 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875799 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875807 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875814 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875822 4693 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875830 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875838 4693 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875846 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875855 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875863 4693 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875877 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875899 4693 reconciler_common.go:293] "Volume detached for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875909 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875918 4693 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875926 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875934 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875943 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875951 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875959 4693 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875966 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875974 4693 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875983 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875991 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.875999 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876007 4693 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876016 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876024 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876032 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876041 4693 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876048 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876056 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876064 4693 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876071 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876079 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876132 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876312 4693 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876322 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876331 4693 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876338 4693 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876346 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876354 4693 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876362 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876390 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876398 4693 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876407 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876415 4693 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876417 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876424 4693 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876450 4693 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876461 4693 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 
12:08:20.876491 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876498 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876507 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876517 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876528 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876537 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876546 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876554 4693 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876562 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876571 4693 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876579 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876587 4693 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876595 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876603 4693 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876611 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876620 4693 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876629 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876637 4693 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876646 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.876653 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.877247 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.878417 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.879245 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.880673 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.881217 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.882352 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.882888 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.882972 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.884017 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.884479 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.885440 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.886084 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.887504 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.888097 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.889270 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.889883 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.890522 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 25 12:08:20 crc 
kubenswrapper[4693]: I1125 12:08:20.891647 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.891918 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.892153 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.901965 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.919233 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.928893 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.933237 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.935650 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162" exitCode=255
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.935855 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162"}
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.940443 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.943888 4693 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 25 12:08:20 crc kubenswrapper[4693]: E1125 12:08:20.944179 4693 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.944555 4693 scope.go:117] "RemoveContainer" containerID="44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.952895 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.969929 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:20 crc kubenswrapper[4693]: I1125 12:08:20.984933 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.001991 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.013534 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.030303 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.041347 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.056545 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25
T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.062654 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.066816 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.069434 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.077050 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 25 12:08:21 crc kubenswrapper[4693]: W1125 12:08:21.082498 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-315dfeefd506a79831b3408c363d018763456833642d8a20c1f456be8a6bc26c WatchSource:0}: Error finding container 315dfeefd506a79831b3408c363d018763456833642d8a20c1f456be8a6bc26c: Status 404 returned error can't find the container with id 315dfeefd506a79831b3408c363d018763456833642d8a20c1f456be8a6bc26c Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.086823 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha25
6:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: W1125 12:08:21.089515 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-9899f5e2a11945f6a7a9642ea77e70ddc3c924a2e3f15190c5e94fc92f332284 WatchSource:0}: Error finding container 9899f5e2a11945f6a7a9642ea77e70ddc3c924a2e3f15190c5e94fc92f332284: Status 404 returned error can't find the container with id 9899f5e2a11945f6a7a9642ea77e70ddc3c924a2e3f15190c5e94fc92f332284 Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.103202 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.119982 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.279902 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.280064 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:08:22.280038123 +0000 UTC m=+22.198123504 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.380524 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.380562 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.380582 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:21 
crc kubenswrapper[4693]: I1125 12:08:21.380602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380700 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380699 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380738 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380715 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380803 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380780 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:22.380767263 +0000 UTC m=+22.298852644 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380773 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380879 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:22.380856145 +0000 UTC m=+22.298941566 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380886 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380899 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380900 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:22.380890106 +0000 UTC m=+22.298975547 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:21 crc kubenswrapper[4693]: E1125 12:08:21.380954 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:22.380938337 +0000 UTC m=+22.299023718 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.939848 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401"} Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.939898 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"467a93f3a0337079a711a96927006c9690afedce6123aa066c44ccc17c4d8102"} Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.941333 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9899f5e2a11945f6a7a9642ea77e70ddc3c924a2e3f15190c5e94fc92f332284"} Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.942882 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e"} Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.942906 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc"} Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.942916 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"315dfeefd506a79831b3408c363d018763456833642d8a20c1f456be8a6bc26c"} Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.945057 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.947833 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535"} Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.948251 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.959365 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.971071 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.979922 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:21 crc kubenswrapper[4693]: I1125 12:08:21.991203 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.004531 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.018629 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.030966 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.043712 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.059126 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.077044 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.092815 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:22Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.111394 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:22Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.130744 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:22Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.147592 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:22Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.190754 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:22Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.210888 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:22Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.291520 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.291748 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:08:24.291715725 +0000 UTC m=+24.209801136 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.393139 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.393187 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.393210 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.393271 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393303 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393331 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393343 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393352 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393358 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393368 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393364 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393368 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393389 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:24.393357162 +0000 UTC m=+24.311442553 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393526 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:24.393508876 +0000 UTC m=+24.311594247 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393541 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:24.393535367 +0000 UTC m=+24.311620748 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.393550 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:24.393545967 +0000 UTC m=+24.311631348 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.812566 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.812618 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:22 crc kubenswrapper[4693]: I1125 12:08:22.812595 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.812724 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.812849 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:22 crc kubenswrapper[4693]: E1125 12:08:22.812926 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:23 crc kubenswrapper[4693]: I1125 12:08:23.955489 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3"} Nov 25 12:08:23 crc kubenswrapper[4693]: I1125 12:08:23.977243 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\
\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:23Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:23 crc kubenswrapper[4693]: I1125 12:08:23.990035 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:23Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.003347 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.016902 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.030917 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.041616 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.051536 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.062667 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.199837 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.213406 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.215992 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.218286 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.225048 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.236923 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.249603 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.264024 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.278186 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.291083 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.303602 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.309031 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.309166 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:08:28.309151104 +0000 UTC m=+28.227236485 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.316881 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc1
8fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.337573 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.348875 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.362410 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.377278 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.391749 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.409602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.409666 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.409711 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.409748 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409829 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409850 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409872 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409879 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409898 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409939 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409949 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:28.409918085 +0000 UTC m=+28.328003516 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409957 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409984 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:28.409966196 +0000 UTC m=+28.328051617 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.409989 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.410122 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:28.41010255 +0000 UTC m=+28.328187971 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.410187 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:28.410137461 +0000 UTC m=+28.328222882 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.415939 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.431221 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.443435 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:24Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.812041 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.812225 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.812284 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:24 crc kubenswrapper[4693]: I1125 12:08:24.812426 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.812512 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:24 crc kubenswrapper[4693]: E1125 12:08:24.812662 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:25 crc kubenswrapper[4693]: I1125 12:08:25.966974 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2gdxx"] Nov 25 12:08:25 crc kubenswrapper[4693]: I1125 12:08:25.967504 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:25 crc kubenswrapper[4693]: I1125 12:08:25.969024 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 12:08:25 crc kubenswrapper[4693]: I1125 12:08:25.969137 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 12:08:25 crc kubenswrapper[4693]: I1125 12:08:25.969446 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 12:08:25 crc kubenswrapper[4693]: I1125 12:08:25.981817 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:25Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:25 crc kubenswrapper[4693]: I1125 12:08:25.992577 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:25Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.007146 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.024452 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjk62\" (UniqueName: \"kubernetes.io/projected/71c80180-d8e5-4615-bb4d-0cd9bea27923-kube-api-access-pjk62\") pod \"node-resolver-2gdxx\" (UID: \"71c80180-d8e5-4615-bb4d-0cd9bea27923\") " pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.024514 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71c80180-d8e5-4615-bb4d-0cd9bea27923-hosts-file\") pod \"node-resolver-2gdxx\" (UID: \"71c80180-d8e5-4615-bb4d-0cd9bea27923\") " pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.026751 4693 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"cont
ainerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.044519 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.059665 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.071356 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.083147 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.095766 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.107584 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.125893 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71c80180-d8e5-4615-bb4d-0cd9bea27923-hosts-file\") pod \"node-resolver-2gdxx\" (UID: \"71c80180-d8e5-4615-bb4d-0cd9bea27923\") " pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.125938 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjk62\" (UniqueName: \"kubernetes.io/projected/71c80180-d8e5-4615-bb4d-0cd9bea27923-kube-api-access-pjk62\") pod \"node-resolver-2gdxx\" (UID: \"71c80180-d8e5-4615-bb4d-0cd9bea27923\") " pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.126072 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71c80180-d8e5-4615-bb4d-0cd9bea27923-hosts-file\") pod \"node-resolver-2gdxx\" (UID: \"71c80180-d8e5-4615-bb4d-0cd9bea27923\") " 
pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.144784 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjk62\" (UniqueName: \"kubernetes.io/projected/71c80180-d8e5-4615-bb4d-0cd9bea27923-kube-api-access-pjk62\") pod \"node-resolver-2gdxx\" (UID: \"71c80180-d8e5-4615-bb4d-0cd9bea27923\") " pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.280033 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2gdxx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.356093 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6l9jx"] Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.356714 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-6d66d"] Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.356920 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.357040 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.358898 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.359056 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.359193 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.359277 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.360160 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.360560 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5rpbd"] Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.360726 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.360841 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.361054 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.362014 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sn9jm"] Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.362664 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.362677 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.363537 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.365589 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.369487 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.369997 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.371242 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.371706 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.371871 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.371988 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.372262 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.372388 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.374818 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.377346 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.391569 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.415577 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429570 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-socket-dir-parent\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429607 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-etc-kubernetes\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429624 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-slash\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429640 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-netns\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429657 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-config\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429674 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-cnibin\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429731 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-system-cni-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429758 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-os-release\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429781 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e8271578-d9e5-4777-8689-da8dd38edfb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429804 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-k8s-cni-cncf-io\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429823 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f238a1e7-499b-466f-b643-bef0ae6f5e5f-rootfs\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429848 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-conf-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429862 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-daemon-config\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429876 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pw97\" (UniqueName: \"kubernetes.io/projected/f238a1e7-499b-466f-b643-bef0ae6f5e5f-kube-api-access-5pw97\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429895 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-script-lib\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.429911 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f714b419-cf37-48b7-9b1a-d36291d788a0-cni-binary-copy\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430164 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-os-release\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430207 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430226 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-cni-multus\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430240 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5dx\" (UniqueName: \"kubernetes.io/projected/f714b419-cf37-48b7-9b1a-d36291d788a0-kube-api-access-4q5dx\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430258 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qrg\" (UniqueName: \"kubernetes.io/projected/e8271578-d9e5-4777-8689-da8dd38edfb6-kube-api-access-95qrg\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430276 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-etc-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430291 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-bin\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430306 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-netd\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430320 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-env-overrides\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430336 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-cni-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430349 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-multus-certs\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430391 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e8271578-d9e5-4777-8689-da8dd38edfb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430413 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-systemd\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430438 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-node-log\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430462 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-kubelet\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430515 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-var-lib-openvswitch\") pod 
\"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430557 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430575 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlfn9\" (UniqueName: \"kubernetes.io/projected/4c247f7d-6187-4052-baee-5c5841e1d9da-kube-api-access-dlfn9\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430617 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-system-cni-dir\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430632 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430647 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-log-socket\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430674 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f238a1e7-499b-466f-b643-bef0ae6f5e5f-mcd-auth-proxy-config\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430689 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-ovn-kubernetes\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430704 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-cnibin\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 
12:08:26.430737 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-netns\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430754 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-cni-bin\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430768 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-hostroot\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430781 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f238a1e7-499b-466f-b643-bef0ae6f5e5f-proxy-tls\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430796 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-kubelet\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430825 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-systemd-units\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430845 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-ovn\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.430859 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c247f7d-6187-4052-baee-5c5841e1d9da-ovn-node-metrics-cert\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.446765 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.458816 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.489696 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.506057 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.528202 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531532 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-netns\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531581 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-cni-bin\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531605 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-hostroot\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531630 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-cnibin\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531661 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-netns\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531666 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f238a1e7-499b-466f-b643-bef0ae6f5e5f-proxy-tls\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531700 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-cni-bin\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531727 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-kubelet\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531750 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-kubelet\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531773 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-ovn\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531799 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-cnibin\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531800 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c247f7d-6187-4052-baee-5c5841e1d9da-ovn-node-metrics-cert\") pod 
\"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531828 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-systemd-units\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531843 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-slash\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531858 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-netns\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531873 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-config\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531888 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-socket-dir-parent\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531905 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-etc-kubernetes\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531923 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-system-cni-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-cnibin\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531954 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-os-release\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531968 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e8271578-d9e5-4777-8689-da8dd38edfb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531984 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-k8s-cni-cncf-io\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532000 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f238a1e7-499b-466f-b643-bef0ae6f5e5f-rootfs\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532013 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-conf-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532026 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-daemon-config\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532042 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pw97\" (UniqueName: \"kubernetes.io/projected/f238a1e7-499b-466f-b643-bef0ae6f5e5f-kube-api-access-5pw97\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532067 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-script-lib\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532082 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-os-release\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532096 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 
12:08:26.532111 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f714b419-cf37-48b7-9b1a-d36291d788a0-cni-binary-copy\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532126 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95qrg\" (UniqueName: \"kubernetes.io/projected/e8271578-d9e5-4777-8689-da8dd38edfb6-kube-api-access-95qrg\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532140 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-etc-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532154 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-bin\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532170 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-netd\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532185 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-cni-multus\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532200 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5dx\" (UniqueName: \"kubernetes.io/projected/f714b419-cf37-48b7-9b1a-d36291d788a0-kube-api-access-4q5dx\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532214 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-multus-certs\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532228 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e8271578-d9e5-4777-8689-da8dd38edfb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532242 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-systemd\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532266 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-node-log\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532282 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-env-overrides\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532281 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-conf-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532298 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-cni-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532354 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532399 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-kubelet\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532425 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-var-lib-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532445 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlfn9\" (UniqueName: \"kubernetes.io/projected/4c247f7d-6187-4052-baee-5c5841e1d9da-kube-api-access-dlfn9\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532462 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-cni-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532467 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-system-cni-dir\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532492 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-system-cni-dir\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532523 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532556 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-systemd-units\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532586 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-slash\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532617 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-netns\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533051 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-daemon-config\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533234 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-config\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533310 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-multus-socket-dir-parent\") pod \"multus-6l9jx\" (UID: 
\"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533353 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-etc-kubernetes\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533415 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-system-cni-dir\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533466 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-cnibin\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533774 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-script-lib\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533834 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-os-release\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.533944 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-cni-multus\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534326 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e8271578-d9e5-4777-8689-da8dd38edfb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534402 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-k8s-cni-cncf-io\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534446 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f238a1e7-499b-466f-b643-bef0ae6f5e5f-rootfs\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534478 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534509 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-var-lib-kubelet\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534518 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e8271578-d9e5-4777-8689-da8dd38edfb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534538 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-var-lib-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534563 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-etc-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.532500 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-openvswitch\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534747 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-log-socket\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534788 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f238a1e7-499b-466f-b643-bef0ae6f5e5f-mcd-auth-proxy-config\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534812 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-ovn-kubernetes\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534868 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-ovn-kubernetes\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534900 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-log-socket\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534909 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f714b419-cf37-48b7-9b1a-d36291d788a0-cni-binary-copy\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534950 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-node-log\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.534977 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-systemd\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535000 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-bin\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535022 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-netd\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535156 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-host-run-multus-certs\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535511 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f238a1e7-499b-466f-b643-bef0ae6f5e5f-mcd-auth-proxy-config\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535546 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-env-overrides\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.531779 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-hostroot\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535579 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-ovn\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535630 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f714b419-cf37-48b7-9b1a-d36291d788a0-os-release\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.535950 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e8271578-d9e5-4777-8689-da8dd38edfb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.537881 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f238a1e7-499b-466f-b643-bef0ae6f5e5f-proxy-tls\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.541900 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c247f7d-6187-4052-baee-5c5841e1d9da-ovn-node-metrics-cert\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.552538 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlfn9\" (UniqueName: \"kubernetes.io/projected/4c247f7d-6187-4052-baee-5c5841e1d9da-kube-api-access-dlfn9\") pod \"ovnkube-node-sn9jm\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.552664 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.555736 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qrg\" (UniqueName: \"kubernetes.io/projected/e8271578-d9e5-4777-8689-da8dd38edfb6-kube-api-access-95qrg\") pod \"multus-additional-cni-plugins-5rpbd\" (UID: \"e8271578-d9e5-4777-8689-da8dd38edfb6\") " pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.557511 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pw97\" (UniqueName: \"kubernetes.io/projected/f238a1e7-499b-466f-b643-bef0ae6f5e5f-kube-api-access-5pw97\") pod \"machine-config-daemon-6d66d\" (UID: \"f238a1e7-499b-466f-b643-bef0ae6f5e5f\") " pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.557758 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5dx\" (UniqueName: \"kubernetes.io/projected/f714b419-cf37-48b7-9b1a-d36291d788a0-kube-api-access-4q5dx\") pod \"multus-6l9jx\" (UID: \"f714b419-cf37-48b7-9b1a-d36291d788a0\") " pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.573951 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.587358 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.597669 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.609774 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.622710 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.634730 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.650330 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.660810 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.674878 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6l9jx" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.677660 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154e
dc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\
\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.682334 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:08:26 crc kubenswrapper[4693]: W1125 12:08:26.683902 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf714b419_cf37_48b7_9b1a_d36291d788a0.slice/crio-42bdc09de66047fc7afee77507cb8d91edab7566a4fedb152a104298ff3467de WatchSource:0}: Error finding container 42bdc09de66047fc7afee77507cb8d91edab7566a4fedb152a104298ff3467de: Status 404 returned error can't find the container with id 42bdc09de66047fc7afee77507cb8d91edab7566a4fedb152a104298ff3467de Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.687958 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.692784 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.703519 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.703520 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: W1125 12:08:26.709843 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8271578_d9e5_4777_8689_da8dd38edfb6.slice/crio-5c50903ba56a01b2360ed3b5e1eb8c1f48520ab503d71f1b8c0119f51f56302a WatchSource:0}: Error finding container 5c50903ba56a01b2360ed3b5e1eb8c1f48520ab503d71f1b8c0119f51f56302a: Status 404 returned error can't find the container with id 5c50903ba56a01b2360ed3b5e1eb8c1f48520ab503d71f1b8c0119f51f56302a Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.717429 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: W1125 12:08:26.728952 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c247f7d_6187_4052_baee_5c5841e1d9da.slice/crio-91900e5c13c852ec9260b6bedac82c3a81d05e42bece88af1e7b8adace084239 WatchSource:0}: Error finding container 91900e5c13c852ec9260b6bedac82c3a81d05e42bece88af1e7b8adace084239: Status 404 returned error can't find the container with id 91900e5c13c852ec9260b6bedac82c3a81d05e42bece88af1e7b8adace084239 Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.737231 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.752978 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.766412 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.769000 4693 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.771287 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.771328 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.771341 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc 
kubenswrapper[4693]: I1125 12:08:26.771540 4693 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.777990 4693 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.778275 4693 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.780179 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.780209 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.780219 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.780756 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.780778 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:26Z","lastTransitionTime":"2025-11-25T12:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.781987 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.807676 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.812169 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.812618 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.812656 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.812624 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.812783 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.813533 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.813413 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.816431 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.816474 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.816488 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.816507 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.816520 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:26Z","lastTransitionTime":"2025-11-25T12:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.831464 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.834242 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.834278 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.834289 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.834305 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.834319 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:26Z","lastTransitionTime":"2025-11-25T12:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.845273 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.848819 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.848874 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.848892 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.848913 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.848929 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:26Z","lastTransitionTime":"2025-11-25T12:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.861933 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.864960 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.864998 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.865010 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.865025 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.865038 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:26Z","lastTransitionTime":"2025-11-25T12:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.874862 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:26 crc kubenswrapper[4693]: E1125 12:08:26.875117 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.876614 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.876666 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.876682 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.876706 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.876724 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:26Z","lastTransitionTime":"2025-11-25T12:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.964657 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.964695 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.964705 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"32eb77b1a9cd5a8cd5a12ef43f14cbc85fb47c03c5b9f7c561b4caa52c670322"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.967234 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerStarted","Data":"79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.967264 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerStarted","Data":"42bdc09de66047fc7afee77507cb8d91edab7566a4fedb152a104298ff3467de"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.968754 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2gdxx" event={"ID":"71c80180-d8e5-4615-bb4d-0cd9bea27923","Type":"ContainerStarted","Data":"3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.968777 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2gdxx" event={"ID":"71c80180-d8e5-4615-bb4d-0cd9bea27923","Type":"ContainerStarted","Data":"4822108cf11a15f581541035dc3155e5fb404b750ee48843d512645655e6075f"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980068 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980107 4693 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980116 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980130 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980140 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:26Z","lastTransitionTime":"2025-11-25T12:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980791 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778" exitCode=0 Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980855 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.980890 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"91900e5c13c852ec9260b6bedac82c3a81d05e42bece88af1e7b8adace084239"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.982490 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerStarted","Data":"ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.982520 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerStarted","Data":"5c50903ba56a01b2360ed3b5e1eb8c1f48520ab503d71f1b8c0119f51f56302a"} Nov 25 12:08:26 crc kubenswrapper[4693]: I1125 12:08:26.995023 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:26Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.026852 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.044185 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.056846 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.067832 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.082149 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.082188 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.082200 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.082216 
4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.082228 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.083102 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.095060 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.111276 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.127677 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.140226 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.159263 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.173484 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.184353 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.184405 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.184416 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.184437 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.184455 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.188728 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.200547 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.212365 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.225818 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.237500 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.250946 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.262299 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.282049 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.286533 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.286596 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.286614 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.286639 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.286657 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.296152 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.312045 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.326263 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\
"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.340977 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.361393 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.373328 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.383935 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.390290 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.390334 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.390346 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.390366 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc 
kubenswrapper[4693]: I1125 12:08:27.390406 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.395218 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:27Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.492772 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.492817 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.492830 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.492849 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.492861 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.594633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.594679 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.594690 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.594709 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.594724 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.697707 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.697750 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.697760 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.697774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.697783 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.800718 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.800776 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.800789 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.800810 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.800824 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.903499 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.903536 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.903550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.903566 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.903577 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:27Z","lastTransitionTime":"2025-11-25T12:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.989823 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.990109 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.990122 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.990130 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.990138 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.990147 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0"} Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.993056 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8271578-d9e5-4777-8689-da8dd38edfb6" containerID="ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5" exitCode=0 Nov 25 12:08:27 crc kubenswrapper[4693]: I1125 12:08:27.993193 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerDied","Data":"ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.006912 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.006957 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.006967 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.006981 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.006991 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.007958 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.029609 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.043119 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.053950 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.064737 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.078776 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.092983 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.105782 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.108925 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.108976 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.108988 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.109005 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.109017 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.119018 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.131966 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.143756 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.161689 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z 
is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.170013 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.180872 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212623 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212666 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212677 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212693 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212703 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315321 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315362 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315397 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315415 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315426 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.347272 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.347517 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:08:36.34749233 +0000 UTC m=+36.265577711 (durationBeforeRetry 8s). 
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212623 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212666 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212677 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212693 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.212703 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315321 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315362 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315397 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315415 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.315426 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.347272 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.347517 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:08:36.34749233 +0000 UTC m=+36.265577711 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
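The TearDown error above is a registry miss rather than a mount fault: the kubelet resolves a CSI volume's driver by name against the set of drivers that have registered over its plugin sockets since this restart, and kubevirt.io.hostpath-provisioner has not re-registered yet this early in boot. A sketch of that lookup pattern, with illustrative types rather than the kubelet's real structures:

package main

import (
    "fmt"
    "sync"
)

// driverRegistry mimics the "list of registered CSI drivers" the error
// message refers to: a driver appears here only after its node plugin
// registers over the kubelet plugin socket.
type driverRegistry struct {
    mu      sync.RWMutex
    drivers map[string]string // driver name -> endpoint
}

func (r *driverRegistry) lookup(name string) (string, error) {
    r.mu.RLock()
    defer r.mu.RUnlock()
    ep, ok := r.drivers[name]
    if !ok {
        return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
    }
    return ep, nil
}

func main() {
    reg := &driverRegistry{drivers: map[string]string{}} // still empty early in startup
    if _, err := reg.lookup("kubevirt.io.hostpath-provisioner"); err != nil {
        fmt.Println("TearDownAt would fail:", err) // clears once the driver re-registers
    }
}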
"openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448716 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448758 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448773 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448780 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448788 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448795 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448775 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:36.448758235 +0000 UTC m=+36.366843626 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448821 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:36.448808706 +0000 UTC m=+36.366894087 (durationBeforeRetry 8s). 
Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448798 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448832 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:36.448826507 +0000 UTC m=+36.366911888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.448860 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:36.448849037 +0000 UTC m=+36.366934478 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.521166 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.521224 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.521244 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.521267 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.521283 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
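The NodeNotReady / "Node became not ready" pairs will keep repeating until the network provider (the ovn-kubernetes and multus pods still initializing below) writes a CNI configuration: the runtime's network readiness gate is essentially a scan of /etc/kubernetes/cni/net.d/ for a usable config file. A simplified sketch of that gate; the real check lives in the CRI runtime and parses the candidate files rather than just globbing for them:

package main

import (
    "fmt"
    "path/filepath"
)

// cniConfigPresent sketches the readiness gate behind "no CNI configuration
// file in /etc/kubernetes/cni/net.d/": NetworkReady stays false until the
// network provider drops a config file into the watched directory.
func cniConfigPresent(dir string) bool {
    for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
        if matches, _ := filepath.Glob(filepath.Join(dir, pat)); len(matches) > 0 {
            return true
        }
    }
    return false
}

func main() {
    if !cniConfigPresent("/etc/kubernetes/cni/net.d") {
        fmt.Println("container runtime network not ready: NetworkReady=false")
    }
}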
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.561523 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xl6bh"]
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.562328 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xl6bh"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.564367 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.565168 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.565441 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.565547 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.591547 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z 
is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.602441 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.617871 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.624240 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.624287 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.624300 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.624320 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.624332 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.632126 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z"
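For reference, the "failed to patch status" bodies quoted throughout are strategic merge patches: the conditions list is merged element by element on its "type" key rather than replaced wholesale, and the $setElementOrder/conditions directive pins the order of the merged result, which is why each patch carries the condition types twice. A trimmed Go sketch of that shape; the values are placeholders drawn from the kube-controller-manager entry above, not the full payload:

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // Shape of the PATCH body the status manager sends; the apiserver
    // never applies it here because the pod admission webhook call fails.
    patch := map[string]any{
        "metadata": map[string]any{"uid": "39f22db8-dbb0-4fa7-9f01-cdcabb55a43f"},
        "status": map[string]any{
            // Desired final ordering of the merged conditions list.
            "$setElementOrder/conditions": []map[string]string{
                {"type": "PodReadyToStartContainers"},
                {"type": "Initialized"},
                {"type": "Ready"},
                {"type": "ContainersReady"},
                {"type": "PodScheduled"},
            },
            // Only the changed elements, keyed by "type", need to appear.
            "conditions": []map[string]any{
                {"type": "Ready", "status": "True", "lastTransitionTime": "2025-11-25T12:08:20Z"},
            },
        },
    }
    out, _ := json.MarshalIndent(patch, "", "  ")
    fmt.Println(string(out))
}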
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtj7j\" (UniqueName: \"kubernetes.io/projected/399a78ed-8e91-43cb-89c1-6732d17d6e41-kube-api-access-dtj7j\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.651561 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/399a78ed-8e91-43cb-89c1-6732d17d6e41-host\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.655646 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"nam
e\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.670613 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.684741 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.696030 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.711713 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc 
kubenswrapper[4693]: I1125 12:08:28.724297 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.726329 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.726404 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.726417 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.726443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.726457 4693 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.739202 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.752325 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/399a78ed-8e91-43cb-89c1-6732d17d6e41-serviceca\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.752417 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtj7j\" (UniqueName: \"kubernetes.io/projected/399a78ed-8e91-43cb-89c1-6732d17d6e41-kube-api-access-dtj7j\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.752448 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/399a78ed-8e91-43cb-89c1-6732d17d6e41-host\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.752529 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/399a78ed-8e91-43cb-89c1-6732d17d6e41-host\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.754395 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/399a78ed-8e91-43cb-89c1-6732d17d6e41-serviceca\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.757321 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.776631 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.777335 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtj7j\" (UniqueName: \"kubernetes.io/projected/399a78ed-8e91-43cb-89c1-6732d17d6e41-kube-api-access-dtj7j\") pod \"node-ca-xl6bh\" (UID: \"399a78ed-8e91-43cb-89c1-6732d17d6e41\") " pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.790202 4693 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.800711 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.811867 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.811907 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.811866 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.812032 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.812178 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:28 crc kubenswrapper[4693]: E1125 12:08:28.812391 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.829193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.829263 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.829275 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.829296 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.829331 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.881229 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xl6bh" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.931737 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.931776 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.931789 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.931804 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.931815 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:28Z","lastTransitionTime":"2025-11-25T12:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.997495 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8271578-d9e5-4777-8689-da8dd38edfb6" containerID="08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754" exitCode=0 Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.997612 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerDied","Data":"08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754"} Nov 25 12:08:28 crc kubenswrapper[4693]: I1125 12:08:28.998477 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xl6bh" event={"ID":"399a78ed-8e91-43cb-89c1-6732d17d6e41","Type":"ContainerStarted","Data":"8534d0cdf68d5eb8f23479f5178ed92f69f87a123d881018a221dadb49b91629"} Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.016331 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.029261 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.034210 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.034247 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.034260 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.034276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.034287 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.041816 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.055136 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.069160 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.082089 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.097010 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.111442 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.123449 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.136340 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.136428 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.136439 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.136452 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.136460 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.137268 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.147542 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"message\\\":\\\"containers with 
unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.161447 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.196099 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.211028 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\
" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.237389 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.238819 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.238846 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.238856 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.238871 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.238881 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.340530 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.340571 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.340583 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.340598 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.340609 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.442972 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.443018 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.443029 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.443047 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.443061 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.545282 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.545564 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.545675 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.545769 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.545859 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.648693 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.648724 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.648732 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.648745 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.648755 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.750152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.750187 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.750195 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.750209 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.750225 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.853360 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.853436 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.853451 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.853469 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.853486 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.955736 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.955771 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.955781 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.955796 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:29 crc kubenswrapper[4693]: I1125 12:08:29.955806 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:29Z","lastTransitionTime":"2025-11-25T12:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.005805 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.009541 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8271578-d9e5-4777-8689-da8dd38edfb6" containerID="b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64" exitCode=0 Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.009630 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerDied","Data":"b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.011168 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xl6bh" event={"ID":"399a78ed-8e91-43cb-89c1-6732d17d6e41","Type":"ContainerStarted","Data":"4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.034993 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.054159 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.058680 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.058722 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.058735 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.058750 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.058765 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.069279 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.082015 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.100460 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.116758 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.132469 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.144513 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.161186 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.161235 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.161246 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.161263 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.161274 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.164617 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.177707 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.188144 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.204968 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.225775 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.238155 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\
" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.251931 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.263084 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.263117 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.263128 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.263143 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.263154 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.266703 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.280200 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.292132 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.303839 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.314512 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.324438 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.340891 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.351763 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.366191 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.366230 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.366241 4693 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.366256 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.366268 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.367535 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\
\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.382887 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.412491 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.428697 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.448588 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.462792 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.468875 4693 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.468914 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.468928 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.468949 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.468965 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.478203 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.598632 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.598692 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.598707 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.598729 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.598742 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.701868 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.701918 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.701931 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.701948 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.701962 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.804216 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.804253 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.804264 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.804282 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.804294 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.812093 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.812114 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.812093 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:30 crc kubenswrapper[4693]: E1125 12:08:30.812298 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:30 crc kubenswrapper[4693]: E1125 12:08:30.812198 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:30 crc kubenswrapper[4693]: E1125 12:08:30.812415 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.827284 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.841186 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.852440 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.862461 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.876671 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.887355 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.905944 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.905990 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.906004 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.906024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.906048 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:30Z","lastTransitionTime":"2025-11-25T12:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.908017 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.924235 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.940182 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.950553 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.963252 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.980346 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:30 crc kubenswrapper[4693]: I1125 12:08:30.993004 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.005342 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.008184 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.008231 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.008244 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.008262 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.008275 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.015995 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8271578-d9e5-4777-8689-da8dd38edfb6" containerID="755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2" exitCode=0 Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.016068 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerDied","Data":"755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.028463 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z 
is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.040540 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.053086 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.067992 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.087234 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.099108 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.110706 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.111405 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.111458 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.111487 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.111513 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc 
kubenswrapper[4693]: I1125 12:08:31.111529 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.119735 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.131222 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.146533 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.176594 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.214489 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.214565 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.214590 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.214620 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.214643 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.218059 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.260457 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.299350 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.317405 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.317450 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.317465 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.317485 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.317500 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.338469 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.387060 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.420270 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.420304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.420311 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.420326 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.420336 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.522952 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.523024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.523042 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.523065 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.523082 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.625967 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.626031 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.626044 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.626059 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.626069 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.729443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.729526 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.729550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.729581 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.729599 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.860451 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.860499 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.860516 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.860538 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.860554 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.968189 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.968231 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.968242 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.968257 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:31 crc kubenswrapper[4693]: I1125 12:08:31.968268 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:31Z","lastTransitionTime":"2025-11-25T12:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.021847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerStarted","Data":"640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.035747 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.065791 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.070943 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.070980 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.070988 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.071001 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.071010 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.083033 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.095609 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.109961 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.126769 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad0
21f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.143331 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.157715 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.172424 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z"
Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.173083 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.173120 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.173152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.173170 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.173179 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.186883 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.197198 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.205735 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.234509 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.249358 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.274995 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:32Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.276304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.276343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.276360 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.276428 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.276450 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.379279 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.379322 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.379332 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.379347 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.379357 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.483275 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.483345 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.483362 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.483414 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.483433 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.586417 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.586498 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.586510 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.586533 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.586547 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.690024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.690088 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.690100 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.690117 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.690130 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.793633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.793720 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.793744 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.793773 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.793798 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.812357 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:32 crc kubenswrapper[4693]: E1125 12:08:32.812579 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.812802 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:32 crc kubenswrapper[4693]: E1125 12:08:32.812977 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.813187 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:32 crc kubenswrapper[4693]: E1125 12:08:32.813530 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.897220 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.897259 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.897272 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.897290 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:32 crc kubenswrapper[4693]: I1125 12:08:32.897302 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:32Z","lastTransitionTime":"2025-11-25T12:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.000577 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.000628 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.000641 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.000661 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.000675 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.105037 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.105094 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.105113 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.105139 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.105158 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.208279 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.208321 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.208335 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.208351 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.208362 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.310762 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.310799 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.310810 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.310827 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.310841 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.413502 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.413547 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.413564 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.413586 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.413603 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.515794 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.515871 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.515882 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.515898 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.515909 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.619112 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.619156 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.619173 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.619200 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.619213 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.721291 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.721327 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.721347 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.721365 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.721393 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.823927 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.823985 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.824001 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.824025 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.824041 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.926854 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.926894 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.926910 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.926931 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:33 crc kubenswrapper[4693]: I1125 12:08:33.926947 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:33Z","lastTransitionTime":"2025-11-25T12:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.028932 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.029163 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.029266 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.029345 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.029457 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.035297 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.039362 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8271578-d9e5-4777-8689-da8dd38edfb6" containerID="640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff" exitCode=0 Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.039434 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerDied","Data":"640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.052051 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.071467 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.089597 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.105876 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.125533 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.132417 4693 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.132570 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.132639 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.132713 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.132743 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.154760 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.170047 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.201567 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.220240 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.235438 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.235712 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.236019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.236331 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.236638 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.237190 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.252517 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.264715 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.281992 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.296271 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.317503 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9
045a66dc28042013430c33fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.329836 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.339837 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.339889 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.339902 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.339923 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.339944 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.360805 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.387424 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.405345 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.417144 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.431882 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.443083 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.443115 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.443124 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.443139 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.443150 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.445308 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.459168 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.471596 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.483436 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.495762 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.512103 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.534229 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log
-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\
\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.543782 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.545075 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.545115 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.545127 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.545144 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.545155 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.560954 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:34Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.648143 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.648183 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.648193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.648208 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.648217 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.751820 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.751885 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.751902 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.751926 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.751944 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.812290 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.812692 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.812498 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:34 crc kubenswrapper[4693]: E1125 12:08:34.812867 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:34 crc kubenswrapper[4693]: E1125 12:08:34.812977 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:34 crc kubenswrapper[4693]: E1125 12:08:34.813138 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.854583 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.854653 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.854668 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.854687 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.854699 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.957697 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.957755 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.957773 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.957797 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:34 crc kubenswrapper[4693]: I1125 12:08:34.957815 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:34Z","lastTransitionTime":"2025-11-25T12:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.052031 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8271578-d9e5-4777-8689-da8dd38edfb6" containerID="a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de" exitCode=0 Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.052189 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.052655 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerDied","Data":"a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.052733 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.053159 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.060884 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.060941 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.060966 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.060995 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.061018 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.090450 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.096198 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.102255 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.109616 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.122853 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.137621 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.156310 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.163531 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.163582 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.163605 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.163633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.163653 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.172899 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.192754 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.208789 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.226029 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.243216 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.257599 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.265608 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.265658 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.265680 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.265707 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.265725 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.271645 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.288824 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.297585 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.308754 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.325431 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b
4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.339361 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.350569 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.363990 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.372857 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.372885 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.372895 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.372910 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.372924 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.390200 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.406724 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.420868 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.435862 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.453013 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.465735 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.475643 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.475680 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.475691 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.475705 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.475714 4693 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.478591 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.490624 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.502299 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.515834 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.524739 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:35Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.578428 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.578469 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.578482 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.578508 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.578524 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.682088 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.682142 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.682155 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.682176 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.682190 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.784660 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.784688 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.784697 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.784711 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.784720 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.887116 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.887176 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.887198 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.887218 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.887232 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.990491 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.990537 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.990548 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.990562 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:35 crc kubenswrapper[4693]: I1125 12:08:35.990573 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:35Z","lastTransitionTime":"2025-11-25T12:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.059252 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" event={"ID":"e8271578-d9e5-4777-8689-da8dd38edfb6","Type":"ContainerStarted","Data":"2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.059360 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.084731 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\"
:\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.093566 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.093622 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.093638 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.093661 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.093678 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.108898 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.137899 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.152262 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.166581 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.178071 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.192724 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.195887 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.195923 4693 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.195934 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.195948 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.195958 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.204691 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.230256 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.243950 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.257615 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.275364 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.287076 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.298894 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.298932 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.298940 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.298954 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.298963 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.304129 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.316998 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.401150 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.401177 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.401185 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.401197 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.401205 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.432163 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.432355 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 12:08:52.432328952 +0000 UTC m=+52.350414333 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.451122 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.466033 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.480968 4693 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.494020 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.503305 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.503512 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.503602 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.503704 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.503780 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.513090 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.526202 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.533772 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.533944 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.534043 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.534128 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.533944 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534296 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:52.534281917 +0000 UTC m=+52.452367298 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.533995 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534483 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:52.534472982 +0000 UTC m=+52.452558363 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534133 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534626 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534734 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534191 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534847 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534865 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534906 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:52.534892173 +0000 UTC m=+52.452977574 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.534992 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:08:52.534979546 +0000 UTC m=+52.453064927 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.538119 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.548222 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.566907 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.585639 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.601779 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.605612 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.605657 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.605670 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.605687 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.605698 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.617420 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.630661 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.641411 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.650292 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.664870 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:36Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.707834 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.707866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.707875 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.707887 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.707896 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.809790 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.809832 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.809843 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.809857 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.809868 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.812333 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.812429 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.812429 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.812533 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.812601 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:36 crc kubenswrapper[4693]: E1125 12:08:36.812656 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.913464 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.913526 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.913548 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.913580 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:36 crc kubenswrapper[4693]: I1125 12:08:36.913602 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:36Z","lastTransitionTime":"2025-11-25T12:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.016589 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.016663 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.016687 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.016717 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.016739 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.056953 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.057017 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.057028 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.057045 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.057057 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.064644 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/0.log" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.068315 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb" exitCode=1 Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.068411 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.069176 4693 scope.go:117] "RemoveContainer" containerID="67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb" Nov 25 12:08:37 crc kubenswrapper[4693]: E1125 12:08:37.077435 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f
4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeByte
s\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"n
ame\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.085532 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.085579 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.085600 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.085623 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.085636 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.088921 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: E1125 12:08:37.109844 4693 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256
:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"si
zeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":46317936
5},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.114945 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.115189 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.115225 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc 
kubenswrapper[4693]: I1125 12:08:37.115241 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.115263 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.115275 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: E1125 12:08:37.131768 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 
2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.143755 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.143827 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.143848 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.143878 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.143910 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.145228 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-c
ontroller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: E1125 12:08:37.162974 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406ee
c4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\
\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.166684 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.166828 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.166886 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.166946 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.166998 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.174990 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: E1125 12:08:37.184046 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 
2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: E1125 12:08:37.184196 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.186057 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.186098 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.186111 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.186128 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.186146 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.188269 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.203120 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.217355 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.228654 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.248240 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.262341 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.279604 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.288410 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.288451 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.288466 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.288483 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.288497 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.294059 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.311728 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 12:08:36.744701 5960 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 12:08:36.744733 5960 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 12:08:36.744738 5960 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 12:08:36.744761 5960 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 12:08:36.744776 5960 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:36.744781 5960 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 12:08:36.745329 5960 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 12:08:36.745355 5960 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:08:36.745361 5960 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 12:08:36.745399 5960 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 12:08:36.745419 5960 factory.go:656] Stopping watch factory\\\\nI1125 12:08:36.745444 5960 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:08:36.745458 5960 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:08:36.745476 5960 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:36.745481 5960 handler.go:208] Removed *v1.Node event 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.320768 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.338926 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:37Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.391067 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.391109 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.391120 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.391137 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.391149 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.493580 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.493618 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.493626 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.493640 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.493648 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.596343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.596412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.596425 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.596443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.596455 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.699056 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.699112 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.699126 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.699140 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.699149 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.801017 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.801058 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.801068 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.801082 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.801094 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.903860 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.903918 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.903935 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.903957 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:37 crc kubenswrapper[4693]: I1125 12:08:37.903973 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:37Z","lastTransitionTime":"2025-11-25T12:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.006358 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.006426 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.006435 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.006451 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.006479 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.074408 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/0.log" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.077714 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.077834 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.098070 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a
3c0f0af0925a2a2a85f04ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 12:08:36.744701 5960 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 12:08:36.744733 5960 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 12:08:36.744738 5960 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 12:08:36.744761 5960 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 12:08:36.744776 5960 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:36.744781 5960 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 12:08:36.745329 5960 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 12:08:36.745355 5960 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:08:36.745361 5960 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 12:08:36.745399 5960 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 12:08:36.745419 5960 factory.go:656] Stopping watch factory\\\\nI1125 12:08:36.745444 5960 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:08:36.745458 5960 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:08:36.745476 5960 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:36.745481 5960 handler.go:208] Removed *v1.Node event 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.108769 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.109404 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.109438 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.109453 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.109470 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.109482 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.122622 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.137242 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.153499 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:
26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.168812 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-opera
tor@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.190250 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.205477 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.211807 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.211868 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.211888 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.211927 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.211959 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.219422 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.233258 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.247764 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.266989 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.283951 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.298235 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.314264 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.314304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.314320 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.314475 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.314497 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.316851 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.417117 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 
12:08:38.417458 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.417564 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.417659 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.417735 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.521228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.521543 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.521683 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.521862 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.521985 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.624914 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.625194 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.625455 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.625617 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.625829 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.649154 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf"] Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.650369 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.653655 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.653964 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.686356 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a
3c0f0af0925a2a2a85f04ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 12:08:36.744701 5960 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 12:08:36.744733 5960 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 12:08:36.744738 5960 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 12:08:36.744761 5960 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 12:08:36.744776 5960 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:36.744781 5960 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 12:08:36.745329 5960 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 12:08:36.745355 5960 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:08:36.745361 5960 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 12:08:36.745399 5960 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 12:08:36.745419 5960 factory.go:656] Stopping watch factory\\\\nI1125 12:08:36.745444 5960 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:08:36.745458 5960 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:08:36.745476 5960 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:36.745481 5960 handler.go:208] Removed *v1.Node event 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.704982 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.720014 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.729566 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.729648 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.729672 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.729701 4693 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.729722 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.736054 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.751751 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.755163 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0c76695-d437-4d1b-92e1-37b2b5b045f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.755243 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0c76695-d437-4d1b-92e1-37b2b5b045f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.755323 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0c76695-d437-4d1b-92e1-37b2b5b045f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.755466 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnhdj\" (UniqueName: \"kubernetes.io/projected/d0c76695-d437-4d1b-92e1-37b2b5b045f0-kube-api-access-qnhdj\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.777497 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.790895 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.812485 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:38 crc kubenswrapper[4693]: E1125 12:08:38.812601 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.812731 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.812824 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:38 crc kubenswrapper[4693]: E1125 12:08:38.812978 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:38 crc kubenswrapper[4693]: E1125 12:08:38.812991 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.824087 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9
f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.833068 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.833122 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.833140 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.833164 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.833182 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.842494 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.855985 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnhdj\" (UniqueName: \"kubernetes.io/projected/d0c76695-d437-4d1b-92e1-37b2b5b045f0-kube-api-access-qnhdj\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.856044 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0c76695-d437-4d1b-92e1-37b2b5b045f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.856077 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0c76695-d437-4d1b-92e1-37b2b5b045f0-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.856098 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0c76695-d437-4d1b-92e1-37b2b5b045f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.856808 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0c76695-d437-4d1b-92e1-37b2b5b045f0-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.857028 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\
\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.857613 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0c76695-d437-4d1b-92e1-37b2b5b045f0-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.863023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0c76695-d437-4d1b-92e1-37b2b5b045f0-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.874744 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.877503 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnhdj\" (UniqueName: \"kubernetes.io/projected/d0c76695-d437-4d1b-92e1-37b2b5b045f0-kube-api-access-qnhdj\") pod \"ovnkube-control-plane-749d76644c-qt9tf\" (UID: \"d0c76695-d437-4d1b-92e1-37b2b5b045f0\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.887198 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 
crc kubenswrapper[4693]: I1125 12:08:38.897569 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 
2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.911759 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.923313 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.935244 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.935278 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.935289 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.935308 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.935320 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:38Z","lastTransitionTime":"2025-11-25T12:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.937941 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:38Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:38 crc kubenswrapper[4693]: I1125 12:08:38.973426 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" Nov 25 12:08:38 crc kubenswrapper[4693]: W1125 12:08:38.995099 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0c76695_d437_4d1b_92e1_37b2b5b045f0.slice/crio-22e0011353ff7bd57ad97527ec67cf4063ae4c90a06b8131881bf9a042def8db WatchSource:0}: Error finding container 22e0011353ff7bd57ad97527ec67cf4063ae4c90a06b8131881bf9a042def8db: Status 404 returned error can't find the container with id 22e0011353ff7bd57ad97527ec67cf4063ae4c90a06b8131881bf9a042def8db Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.037976 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.038012 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.038023 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.038039 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.038051 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.085901 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" event={"ID":"d0c76695-d437-4d1b-92e1-37b2b5b045f0","Type":"ContainerStarted","Data":"22e0011353ff7bd57ad97527ec67cf4063ae4c90a06b8131881bf9a042def8db"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.087727 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/1.log" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.088432 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/0.log" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.091232 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4" exitCode=1 Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.091265 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.091310 4693 scope.go:117] "RemoveContainer" containerID="67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.092025 4693 scope.go:117] "RemoveContainer" containerID="5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4" Nov 25 12:08:39 crc kubenswrapper[4693]: E1125 12:08:39.092218 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.106146 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.118486 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.129507 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.140315 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.140424 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.140453 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.140492 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.140504 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.141422 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.153130 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.163430 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.173090 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.188557 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 12:08:36.744701 5960 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 12:08:36.744733 5960 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 12:08:36.744738 5960 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 12:08:36.744761 5960 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 12:08:36.744776 5960 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:36.744781 5960 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 12:08:36.745329 5960 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 12:08:36.745355 5960 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:08:36.745361 5960 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 12:08:36.745399 5960 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 12:08:36.745419 5960 factory.go:656] Stopping watch factory\\\\nI1125 12:08:36.745444 5960 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:08:36.745458 5960 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:08:36.745476 5960 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:36.745481 5960 handler.go:208] Removed *v1.Node event handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 12:08:37.894045 6149 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1125 12:08:37.894051 6149 services_controller.go:443] Built service openshift-oauth-apiserver/api LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\
\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.198187 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.213562 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.227017 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.243159 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.243222 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.243239 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.243262 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.243279 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.246086 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.257656 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.268700 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.279075 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.293676 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5d
b7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\
":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:39Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.346013 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.346076 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.346094 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.346121 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.346139 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.449080 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.449414 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.449510 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.449638 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.449781 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.552275 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.552536 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.552741 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.552930 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.553122 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.655753 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.655972 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.656057 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.656171 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.656283 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.758813 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.758866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.758882 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.758903 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.758918 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.861734 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.861768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.861779 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.861796 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.861808 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.964161 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.964225 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.964245 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.964271 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:39 crc kubenswrapper[4693]: I1125 12:08:39.964293 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:39Z","lastTransitionTime":"2025-11-25T12:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.067181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.067250 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.067267 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.067291 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.067310 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.097119 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" event={"ID":"d0c76695-d437-4d1b-92e1-37b2b5b045f0","Type":"ContainerStarted","Data":"e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.097206 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" event={"ID":"d0c76695-d437-4d1b-92e1-37b2b5b045f0","Type":"ContainerStarted","Data":"04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.099599 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/1.log" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.103817 4693 scope.go:117] "RemoveContainer" containerID="5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4" Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.104005 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.119872 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.141731 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.144010 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-n2f89"] Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.144621 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.144705 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.165018 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.170239 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.170290 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.170308 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.170330 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.170348 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.186027 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.205846 4693 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.221649 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.238502 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 
12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.265525 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9
fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67bdb24b1aa38b3a0ee6495b358f8b84125356e9045a66dc28042013430c33fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:37Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1125 12:08:36.744701 5960 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1125 12:08:36.744733 5960 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1125 12:08:36.744738 5960 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1125 12:08:36.744761 5960 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1125 12:08:36.744776 5960 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:36.744781 5960 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1125 12:08:36.745329 5960 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1125 12:08:36.745355 5960 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:08:36.745361 5960 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1125 12:08:36.745399 5960 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1125 12:08:36.745419 5960 factory.go:656] Stopping watch factory\\\\nI1125 12:08:36.745444 5960 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:08:36.745458 5960 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:08:36.745476 5960 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:36.745481 5960 handler.go:208] Removed *v1.Node event 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 12:08:37.894045 6149 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1125 12:08:37.894051 6149 services_controller.go:443] Built service openshift-oauth-apiserver/api LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, 
clusterEndpoints:services.l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.269523 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.269573 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rjc\" (UniqueName: \"kubernetes.io/projected/a10eb19c-b500-4cf9-961d-1892ba67560a-kube-api-access-w2rjc\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.273486 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.273552 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.273638 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.273727 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.273746 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.282349 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.301060 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.320291 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.343526 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.361038 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.371603 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rjc\" (UniqueName: \"kubernetes.io/projected/a10eb19c-b500-4cf9-961d-1892ba67560a-kube-api-access-w2rjc\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.372361 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.372739 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.372872 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:08:40.872831609 +0000 UTC m=+40.790917040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.378834 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.380597 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.380689 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.380716 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.381193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.382512 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.392587 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.406409 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rjc\" (UniqueName: \"kubernetes.io/projected/a10eb19c-b500-4cf9-961d-1892ba67560a-kube-api-access-w2rjc\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.408889 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451e
c3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\
\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.420792 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.437284 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.451035 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.463346 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.476195 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 
12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.485018 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.485062 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.485075 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.485094 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.485106 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.494210 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.508125 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.536116 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a
3c0f0af0925a2a2a85f04ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 12:08:37.894045 6149 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1125 12:08:37.894051 6149 services_controller.go:443] Built service openshift-oauth-apiserver/api LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.551258 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.564334 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.583826 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.587667 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.587722 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.587739 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.587765 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.587784 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.602917 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.622079 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.638750 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.662587 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.680944 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.690172 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.690545 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.690996 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.691278 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.691604 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.707293 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.794109 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.794499 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.794701 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.794888 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.795029 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.811982 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.812209 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.812551 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.812584 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.812935 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.812677 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.830339 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.850810 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.870979 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.878203 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.878408 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:40 crc kubenswrapper[4693]: E1125 12:08:40.878481 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:08:41.878460614 +0000 UTC m=+41.796546035 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.894482 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f5840
8f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.898124 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.898174 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.898186 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.898209 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.898218 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:40Z","lastTransitionTime":"2025-11-25T12:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.913264 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.930260 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.947270 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.966329 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:40 crc kubenswrapper[4693]: I1125 12:08:40.987917 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:40Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.004648 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.006845 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.006879 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.006890 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.006904 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.006914 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.024488 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.039641 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.052529 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.063059 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.077396 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 
12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.107096 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9
fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 12:08:37.894045 6149 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1125 12:08:37.894051 6149 services_controller.go:443] Built service openshift-oauth-apiserver/api LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, 
clusterEndpoints:services.l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.109609 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.109659 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.109677 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.109702 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.109717 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.122047 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:41Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.212811 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.212876 4693 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.212893 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.212917 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.212935 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.316698 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.316781 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.316809 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.316841 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.316866 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.419318 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.419392 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.419408 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.419426 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.419441 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.522762 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.523079 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.523095 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.523118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.523135 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.626825 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.626874 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.626891 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.626914 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.626932 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.730787 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.730848 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.730865 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.730887 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.730903 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.812546 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:41 crc kubenswrapper[4693]: E1125 12:08:41.812776 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.833841 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.833910 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.833936 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.833960 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.833984 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.890897 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:41 crc kubenswrapper[4693]: E1125 12:08:41.891068 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:41 crc kubenswrapper[4693]: E1125 12:08:41.891119 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:08:43.891103866 +0000 UTC m=+43.809189257 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.937114 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.937155 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.937167 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.937183 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:41 crc kubenswrapper[4693]: I1125 12:08:41.937198 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:41Z","lastTransitionTime":"2025-11-25T12:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.040213 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.040328 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.040350 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.040401 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.040421 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.143941 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.144007 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.144024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.144049 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.144066 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.251921 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.251983 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.252072 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.252280 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.252444 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.356002 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.356056 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.356091 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.356120 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.356142 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.458522 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.458605 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.458638 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.458668 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.458688 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.562233 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.562356 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.562436 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.562468 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.562491 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.665748 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.665814 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.665831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.665862 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.665879 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.769237 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.769280 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.769291 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.769308 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.769319 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.812455 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:42 crc kubenswrapper[4693]: E1125 12:08:42.812583 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.812461 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.812680 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:42 crc kubenswrapper[4693]: E1125 12:08:42.812805 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:42 crc kubenswrapper[4693]: E1125 12:08:42.812950 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.872594 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.872656 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.872672 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.872699 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.872721 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.976398 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.976500 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.976518 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.976576 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:42 crc kubenswrapper[4693]: I1125 12:08:42.976594 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:42Z","lastTransitionTime":"2025-11-25T12:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.081078 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.081749 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.081825 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.081851 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.081869 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.185774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.185839 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.185857 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.185880 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.185898 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.289302 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.289359 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.289415 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.289443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.289506 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.404996 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.405038 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.405047 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.405078 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.405088 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.508427 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.508510 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.508539 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.508568 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.508589 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.611978 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.612056 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.612073 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.612102 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.612117 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.714616 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.714654 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.714676 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.714695 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.714708 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.812025 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:43 crc kubenswrapper[4693]: E1125 12:08:43.812197 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.817837 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.817923 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.817935 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.817949 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.817962 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.913258 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:43 crc kubenswrapper[4693]: E1125 12:08:43.913424 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:43 crc kubenswrapper[4693]: E1125 12:08:43.913473 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:08:47.913459343 +0000 UTC m=+47.831544724 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.921051 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.921127 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.921150 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.921179 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:43 crc kubenswrapper[4693]: I1125 12:08:43.921202 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:43Z","lastTransitionTime":"2025-11-25T12:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.024758 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.024831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.024855 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.024886 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.024908 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.128091 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.128131 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.128156 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.128175 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.128190 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.230741 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.230788 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.230801 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.230819 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.230830 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.332959 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.333023 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.333035 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.333051 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.333063 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.434769 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.434807 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.434817 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.434831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.434842 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.538024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.538102 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.538136 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.538167 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.538188 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.640257 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.640298 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.640308 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.640324 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.640337 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.747778 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.748039 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.748339 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.748361 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.748428 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.812204 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.812264 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:44 crc kubenswrapper[4693]: E1125 12:08:44.812324 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.812208 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:44 crc kubenswrapper[4693]: E1125 12:08:44.812709 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:44 crc kubenswrapper[4693]: E1125 12:08:44.812807 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.854889 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.854959 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.854986 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.855016 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.855039 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.957768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.957810 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.957824 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.957851 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:44 crc kubenswrapper[4693]: I1125 12:08:44.957867 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:44Z","lastTransitionTime":"2025-11-25T12:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.060866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.060929 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.060963 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.060989 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.061001 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.163066 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.163114 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.163131 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.163160 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.163181 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.266538 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.266615 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.266636 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.266661 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.266678 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.368700 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.368756 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.368774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.368797 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.368814 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.471207 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.471263 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.471280 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.471305 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.471323 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.574630 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.574682 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.574699 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.574725 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.574747 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.677653 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.677950 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.678081 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.678183 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.678276 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.781471 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.781526 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.781559 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.781583 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.781601 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.812317 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:45 crc kubenswrapper[4693]: E1125 12:08:45.812611 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.884606 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.884661 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.884677 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.884702 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.884721 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.987835 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.988184 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.988348 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.988591 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:45 crc kubenswrapper[4693]: I1125 12:08:45.988760 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:45Z","lastTransitionTime":"2025-11-25T12:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.091747 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.091788 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.091799 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.091814 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.091825 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.194415 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.194467 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.194479 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.194511 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.194535 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.297612 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.297672 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.297691 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.297714 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.297732 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.400707 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.400768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.400785 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.400808 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.400826 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.503197 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.503252 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.503268 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.503291 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.503308 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.606899 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.606969 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.606994 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.607019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.607037 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.709850 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.709934 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.709976 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.710009 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.710032 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.812349 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.812539 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:46 crc kubenswrapper[4693]: E1125 12:08:46.812555 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:46 crc kubenswrapper[4693]: E1125 12:08:46.813012 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.813302 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.813394 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.813427 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.813443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.813466 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.813484 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:46 crc kubenswrapper[4693]: E1125 12:08:46.813490 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.916430 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.916511 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.916535 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.916559 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:46 crc kubenswrapper[4693]: I1125 12:08:46.916576 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:46Z","lastTransitionTime":"2025-11-25T12:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.019733 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.020101 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.020147 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.020181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.020218 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.124658 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.124723 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.124740 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.124768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.124789 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.227412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.227461 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.227475 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.227493 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.227514 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.330337 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.330417 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.330434 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.330451 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.330463 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.433212 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.433263 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.433273 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.433289 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.433301 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.478785 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.478836 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.478848 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.478869 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.478881 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.493047 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:47Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.496061 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.496193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.496276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.496396 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.496494 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.509585 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:47Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.513180 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.513322 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
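[editor's note] Every retry above fails for the same reason: before the kubelet's status patch reaches the API server it must pass the node.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743, and that endpoint is serving a certificate whose notAfter (2025-08-24T17:21:41Z) is roughly three months earlier than the node's clock (2025-11-25). A minimal standalone Go sketch (not part of this log; it assumes the endpoint is still listening, and the deliberate skip-verify dial is only so the expired certificate can be inspected rather than trusted) that confirms this from the node would be:

```go
// certprobe.go - dial the webhook endpoint named in the kubelet error and
// print the served leaf certificate's validity window.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	const addr = "127.0.0.1:9743" // taken from the Post URL in the error above

	// Skip verification on purpose: the point is to read the expired
	// certificate, not to trust it.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial %s: %v", addr, err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Printf("expired:   %v\n", now.After(cert.NotAfter)) // true per the x509 error in this log
}
```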
event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.513415 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.513571 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.513615 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.526932 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:47Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.531421 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.531494 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
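[editor's note] Interleaved with the webhook failures, the kubelet keeps reporting NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI network configuration. The readiness condition it is waiting on amounts to "at least one CNI conf file exists in that directory"; the Go sketch below (not part of this log; an approximation of the libcni-style scan, with the usual .conf/.conflist/.json extension set) reproduces the check:

```go
// cnicheck.go - look for CNI network configuration in the directory named
// in the NetworkPluginNotReady message above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const confDir = "/etc/kubernetes/cni/net.d" // from the log message above

	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		return
	}
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		// This is the state the kubelet is reporting: NetworkReady=false
		// until the network provider writes its configuration here.
		fmt.Println("no CNI configuration files found; network plugin not ready")
		return
	}
	fmt.Printf("CNI config present: %v\n", found)
}
```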
event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.531508 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.531522 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.531531 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.543451 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:47Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.546776 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.546804 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.546815 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.546831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.546842 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.558772 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:47Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.558922 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.560829 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.560858 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.560869 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.560886 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.560898 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.664092 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.664140 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.664151 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.664166 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.664175 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.765955 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.766001 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.766011 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.766023 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.766031 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.812119 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.812289 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.867765 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.867807 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.867816 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.867831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.867842 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.957349 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.957525 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:47 crc kubenswrapper[4693]: E1125 12:08:47.957593 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:08:55.957578183 +0000 UTC m=+55.875663564 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.970215 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.970250 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.970261 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.970277 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:47 crc kubenswrapper[4693]: I1125 12:08:47.970288 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:47Z","lastTransitionTime":"2025-11-25T12:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.072925 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.072997 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.073013 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.073029 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.073055 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.175663 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.175984 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.176113 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.176216 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.176460 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.279266 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.279299 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.279308 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.279324 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.279335 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.333334 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.334514 4693 scope.go:117] "RemoveContainer" containerID="5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.381472 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.381501 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.381510 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.381523 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.381532 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.484113 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.484558 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.484571 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.484596 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.484612 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.586697 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.586738 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.586751 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.586772 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.586785 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.689518 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.689581 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.689596 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.689621 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.689646 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.792642 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.792700 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.792710 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.792731 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.792745 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.812226 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.812278 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.812339 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:48 crc kubenswrapper[4693]: E1125 12:08:48.812460 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:48 crc kubenswrapper[4693]: E1125 12:08:48.812628 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:48 crc kubenswrapper[4693]: E1125 12:08:48.812718 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.895192 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.895253 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.895267 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.895286 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.895307 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.997837 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.997872 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.997879 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.997892 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:48 crc kubenswrapper[4693]: I1125 12:08:48.997901 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:48Z","lastTransitionTime":"2025-11-25T12:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.100755 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.100790 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.100798 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.100813 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.100821 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.135078 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/1.log" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.137606 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.138679 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.154018 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.167552 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.179759 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.190539 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.200794 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.203468 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.203502 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.203515 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.203535 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.203549 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.214708 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.229151 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.254202 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 12:08:37.894045 6149 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1125 12:08:37.894051 6149 services_controller.go:443] Built service openshift-oauth-apiserver/api LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, 
clusterEndpoints:services.l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.268790 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.281650 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.297073 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.306283 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.306316 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.306328 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.306348 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.306360 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.310675 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.327961 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.342410 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.355289 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.368841 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.386807 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:49Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.408894 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.408946 4693 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.408957 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.408977 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.408990 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.510806 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.510852 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.510863 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.510879 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.510889 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.614456 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.614564 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.614582 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.614646 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.614665 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.719097 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.719157 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.719172 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.719197 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.719214 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.811763 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:49 crc kubenswrapper[4693]: E1125 12:08:49.811959 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.827714 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.827792 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.827813 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.827841 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.827865 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.931243 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.931319 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.931343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.931404 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:49 crc kubenswrapper[4693]: I1125 12:08:49.931430 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:49Z","lastTransitionTime":"2025-11-25T12:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.035061 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.035121 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.035144 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.035170 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.035188 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.137569 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.137630 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.137647 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.137669 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.137688 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.143307 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/2.log" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.144369 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/1.log" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.148677 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4" exitCode=1 Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.148734 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.148811 4693 scope.go:117] "RemoveContainer" containerID="5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.153241 4693 scope.go:117] "RemoveContainer" containerID="c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4" Nov 25 12:08:50 crc kubenswrapper[4693]: E1125 12:08:50.153955 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.186982 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8
a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 12:08:37.894045 6149 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1125 12:08:37.894051 6149 services_controller.go:443] Built service openshift-oauth-apiserver/api LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.204148 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.228690 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.239917 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.239952 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.239965 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.239985 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.239998 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.245511 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.260408 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.274968 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.292787 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.312738 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.341824 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.342524 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.342583 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.342597 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.342615 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.342628 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.355658 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.367673 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.386324 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.403014 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.418126 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 
12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.439326 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.444823 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.444866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.444877 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.444893 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.444906 4693 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.454731 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.468641 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.547517 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.547605 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.547628 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.547653 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.547672 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.651142 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.651211 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.651224 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.651240 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.651253 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.754436 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.754499 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.754520 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.754552 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.754575 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.811950 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.811973 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.812075 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:50 crc kubenswrapper[4693]: E1125 12:08:50.812078 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:50 crc kubenswrapper[4693]: E1125 12:08:50.812204 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:50 crc kubenswrapper[4693]: E1125 12:08:50.812283 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.830784 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.850645 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.856987 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.857019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.857031 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.857048 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.857061 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.873050 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:
08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.889822 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.924661 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.953738 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z"
Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.959989 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.960054 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.960076 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.960105 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.960145 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:50Z","lastTransitionTime":"2025-11-25T12:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.972880 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:50 crc kubenswrapper[4693]: I1125 12:08:50.989222 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.001964 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:50Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.019114 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.033041 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea
3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.045667 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.062941 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.063005 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.063024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.063050 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.063068 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.063770 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.080488 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.097702 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.119949 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8
a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5fe89ad761133dd0644465fa95e81a3ecd07f49a3c0f0af0925a2a2a85f04ea4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"message\\\":\\\"alse, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.21\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1125 12:08:37.894045 6149 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}\\\\nI1125 12:08:37.894051 6149 services_controller.go:443] Built service openshift-oauth-apiserver/api LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.140\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.l\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ov
errides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.132908 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.153194 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/2.log" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.156664 4693 scope.go:117] "RemoveContainer" containerID="c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4" Nov 25 12:08:51 crc kubenswrapper[4693]: E1125 12:08:51.156805 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.165136 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.165159 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.165167 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.165182 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: 
I1125 12:08:51.165194 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.169187 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:
8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.184583 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.197120 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.208766 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.222085 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.239489 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.253361 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.264921 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.268563 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.268617 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.268629 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.268653 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.268665 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.278537 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.292445 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.304778 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.317967 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.329714 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 
12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.350437 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9
fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.363330 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.370953 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.370985 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.370997 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.371014 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.371024 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.377396 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.389160 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:51Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.474756 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.474797 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.474808 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.474825 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.474838 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.579254 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.579367 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.579410 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.579434 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.579451 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.682881 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.682946 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.682963 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.682988 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.683006 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.786088 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.786173 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.786191 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.786216 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.786235 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.812785 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:51 crc kubenswrapper[4693]: E1125 12:08:51.812980 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.889739 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.889788 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.889799 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.889814 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.889825 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.993145 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.993223 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.993239 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.993256 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:51 crc kubenswrapper[4693]: I1125 12:08:51.993269 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:51Z","lastTransitionTime":"2025-11-25T12:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.095254 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.095420 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.095443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.095467 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.095486 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.197900 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.197938 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.197948 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.197962 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.197972 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.300877 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.300940 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.300953 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.300969 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.300979 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.404705 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.404774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.404798 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.404836 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.404859 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.507539 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.507593 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.507604 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.507621 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.507637 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.510501 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.510832 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:09:24.510767477 +0000 UTC m=+84.428852858 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.610990 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611075 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611093 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611119 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611136 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611617 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611690 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611725 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.611754 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.611920 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.611922 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.611956 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.611970 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.611996 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.611998 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.612023 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.611929 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.612069 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:09:24.612043213 +0000 UTC m=+84.530128624 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.612154 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:09:24.612125025 +0000 UTC m=+84.530210436 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.612194 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:09:24.612179756 +0000 UTC m=+84.530265177 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.612224 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:09:24.612213917 +0000 UTC m=+84.530299328 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.713706 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.713747 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.713761 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.713778 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.713790 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.742886 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.754836 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 
12:08:52.756631 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.775495 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.784957 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.796128 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.811910 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.811921 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.811999 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.812065 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.812177 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:52 crc kubenswrapper[4693]: E1125 12:08:52.812299 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.815133 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9
f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.815701 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.815731 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.815740 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.815772 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.815782 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.833800 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.844922 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.859142 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.900099 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.917582 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.917616 4693 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.917624 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.917638 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.917650 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:52Z","lastTransitionTime":"2025-11-25T12:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.929927 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.946225 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.964089 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.975517 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:52 crc kubenswrapper[4693]: I1125 12:08:52.989986 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.001824 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:52Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.013356 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:53Z is after 2025-08-24T17:21:41Z" Nov 25 
12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.021157 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.021228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.021248 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.021274 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.021304 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.032188 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 
12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:53Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.124962 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.125056 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.125081 4693 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.125112 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.125136 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.228842 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.228954 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.229025 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.229061 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.229128 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.332408 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.332455 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.332463 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.332478 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.332489 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.434713 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.434747 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.434759 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.434772 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.434783 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.537181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.537232 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.537242 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.537259 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.537270 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.640302 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.640356 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.640409 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.640434 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.640451 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.743156 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.743216 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.743235 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.743258 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.743277 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.812030 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:53 crc kubenswrapper[4693]: E1125 12:08:53.812233 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.846619 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.846675 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.846686 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.846700 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.846708 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.948855 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.948900 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.948916 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.948938 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:53 crc kubenswrapper[4693]: I1125 12:08:53.948953 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:53Z","lastTransitionTime":"2025-11-25T12:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.051529 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.051583 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.051601 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.051622 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.051640 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.154648 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.154705 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.154717 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.154732 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.154744 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.256577 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.256628 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.256641 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.256662 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.256693 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.358854 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.358897 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.358908 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.358923 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.358933 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.461926 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.461961 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.461972 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.461985 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.461994 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.565089 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.565163 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.565176 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.565201 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.565215 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.667486 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.667525 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.667545 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.667559 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.667570 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.770601 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.770644 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.770655 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.770670 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.770683 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.812515 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.812569 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.812515 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:54 crc kubenswrapper[4693]: E1125 12:08:54.812651 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:54 crc kubenswrapper[4693]: E1125 12:08:54.812753 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:54 crc kubenswrapper[4693]: E1125 12:08:54.812863 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.873863 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.873911 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.873925 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.873944 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.873957 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.976314 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.976408 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.976426 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.976446 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:54 crc kubenswrapper[4693]: I1125 12:08:54.976461 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:54Z","lastTransitionTime":"2025-11-25T12:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.079890 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.079949 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.079963 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.079983 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.079996 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.183465 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.183537 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.183554 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.183580 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.183601 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.287183 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.287256 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.287274 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.287299 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.287317 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.390954 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.391016 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.391033 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.391057 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.391079 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.494456 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.494532 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.494560 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.494583 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.494602 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.598671 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.598733 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.598744 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.598765 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.598777 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.702870 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.702939 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.702961 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.702991 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.703014 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.805938 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.806031 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.806049 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.806073 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.806091 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.812292 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:55 crc kubenswrapper[4693]: E1125 12:08:55.812491 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.908948 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.909098 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.909118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.909152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:55 crc kubenswrapper[4693]: I1125 12:08:55.909170 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:55Z","lastTransitionTime":"2025-11-25T12:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.011893 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.011959 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.011977 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.012002 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.012022 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.050644 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:56 crc kubenswrapper[4693]: E1125 12:08:56.050839 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:56 crc kubenswrapper[4693]: E1125 12:08:56.050918 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:09:12.050895639 +0000 UTC m=+71.968981060 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.114449 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.114517 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.114537 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.114565 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.114583 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.217304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.217338 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.217349 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.217363 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.217390 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.320406 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.320489 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.320515 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.320546 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.320569 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.423238 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.423267 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.423276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.423288 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.423298 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.525165 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.525202 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.525210 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.525222 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.525238 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.628168 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.628215 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.628228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.628252 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.628265 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.731783 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.731871 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.731903 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.731935 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.731958 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.812134 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.812228 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.812248 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:56 crc kubenswrapper[4693]: E1125 12:08:56.812256 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:56 crc kubenswrapper[4693]: E1125 12:08:56.812424 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:56 crc kubenswrapper[4693]: E1125 12:08:56.812518 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.834348 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.834431 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.834442 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.834460 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.834472 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.937034 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.937093 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.937105 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.937128 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:56 crc kubenswrapper[4693]: I1125 12:08:56.937154 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:56Z","lastTransitionTime":"2025-11-25T12:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.040465 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.040537 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.040551 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.040580 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.040597 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.144258 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.144326 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.144338 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.144361 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.144417 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.247609 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.247665 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.247674 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.247694 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.247706 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.350517 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.350590 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.350602 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.350622 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.350640 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.453075 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.453145 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.453154 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.453167 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.453177 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.556246 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.556304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.556314 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.556333 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.556351 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.658892 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.658933 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.658943 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.658958 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.658969 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.761948 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.762005 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.762020 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.762041 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.762057 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.812328 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:57 crc kubenswrapper[4693]: E1125 12:08:57.812634 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.865387 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.865426 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.865435 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.865449 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.865460 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.926937 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.926980 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.926995 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.927054 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.927085 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: E1125 12:08:57.940636 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:57Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.948666 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.948726 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.948743 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.948768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.948786 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: E1125 12:08:57.970222 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:57Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.974952 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.974983 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
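What the kubelet is retrying here is a strategic-merge patch over Node.status: $setElementOrder/conditions pins the order of the four conditions, and each condition carries type, status, reason, message, and the heartbeat/transition times. The short Go sketch below decodes the Ready condition exactly as it appears in the "Node became not ready" entries; it uses plain encoding/json with field names taken from the log, not the k8s.io/api types, and is an illustration only.

// condition.go — hypothetical decoder for the condition object logged
// by setters.go above.
package main

import (
    "encoding/json"
    "fmt"
)

type NodeCondition struct {
    Type               string `json:"type"`
    Status             string `json:"status"`
    LastHeartbeatTime  string `json:"lastHeartbeatTime"`
    LastTransitionTime string `json:"lastTransitionTime"`
    Reason             string `json:"reason"`
    Message            string `json:"message"`
}

func main() {
    // Condition copied from the 12:08:57.948786 entry above.
    raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

    var c NodeCondition
    if err := json.Unmarshal([]byte(raw), &c); err != nil {
        panic(err)
    }
    fmt.Printf("%s=%s (reason=%s)\n%s\n", c.Type, c.Status, c.Reason, c.Message)
}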
event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.974991 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.975004 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.975012 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:57 crc kubenswrapper[4693]: E1125 12:08:57.992617 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:57Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.997978 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.998005 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.998013 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.998028 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:57 crc kubenswrapper[4693]: I1125 12:08:57.998037 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:57Z","lastTransitionTime":"2025-11-25T12:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: E1125 12:08:58.016025 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:58Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.021031 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.021080 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.021098 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.021120 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.021138 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: E1125 12:08:58.040968 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:08:58Z is after 2025-08-24T17:21:41Z" Nov 25 12:08:58 crc kubenswrapper[4693]: E1125 12:08:58.041466 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.046094 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.046220 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.046339 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.046473 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.046591 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.149808 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.150032 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.150125 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.150219 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.150304 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.252879 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.252926 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.252943 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.252966 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.252983 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.357142 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.357534 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.357705 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.357885 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.358222 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.461346 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.461611 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.461704 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.461800 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.461890 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.565149 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.565189 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.565199 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.565221 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.565233 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.668667 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.668721 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.668731 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.668746 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.668756 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.772214 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.772284 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.772303 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.772335 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.772357 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.812155 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.812208 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:08:58 crc kubenswrapper[4693]: E1125 12:08:58.812296 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.812348 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:08:58 crc kubenswrapper[4693]: E1125 12:08:58.812574 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:08:58 crc kubenswrapper[4693]: E1125 12:08:58.812641 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.875719 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.875795 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.875848 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.875873 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.875890 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.985875 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.985933 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.985951 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.985975 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:58 crc kubenswrapper[4693]: I1125 12:08:58.985992 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:58Z","lastTransitionTime":"2025-11-25T12:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.088821 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.088911 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.088930 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.088951 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.088967 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.191479 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.191544 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.191554 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.191569 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.191580 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.294234 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.294342 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.294365 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.294449 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.294470 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.397924 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.398023 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.398042 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.398096 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.398115 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.501323 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.501420 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.501444 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.501474 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.501497 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.603989 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.604033 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.604044 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.604062 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.604073 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.706615 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.706701 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.706719 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.706742 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.706760 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.809684 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.809771 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.809799 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.809831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.809855 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.812259 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:08:59 crc kubenswrapper[4693]: E1125 12:08:59.812513 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.913119 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.913171 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.913188 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.913209 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:08:59 crc kubenswrapper[4693]: I1125 12:08:59.913225 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:08:59Z","lastTransitionTime":"2025-11-25T12:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.016042 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.016091 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.016102 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.016144 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.016158 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.120104 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.120176 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.120197 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.120226 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.120246 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.223558 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.223632 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.223655 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.223685 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.223709 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.326179 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.326242 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.326257 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.326281 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.326294 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.429156 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.429227 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.429251 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.429281 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.429307 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.531816 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.531856 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.531866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.531881 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.531895 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.635222 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.635291 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.635315 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.635346 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.635446 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.738505 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.738599 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.738616 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.738645 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.738667 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.812662 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.812678 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:00 crc kubenswrapper[4693]: E1125 12:09:00.812856 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.812900 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:00 crc kubenswrapper[4693]: E1125 12:09:00.813016 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:00 crc kubenswrapper[4693]: E1125 12:09:00.813334 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.840927 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.840985 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.841000 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.841021 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.841038 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.842004 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8
a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.856431 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.871725 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.887314 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.900720 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.916071 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07
dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.930408 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce
0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.943720 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.943766 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.943777 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.943793 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.943805 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:00Z","lastTransitionTime":"2025-11-25T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.950885 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.965135 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.979087 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:00 crc kubenswrapper[4693]: I1125 12:09:00.992142 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.001795 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:00Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.013413 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:01Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.032696 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:01Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.046604 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.046651 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.047238 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.047755 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.047929 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.048300 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:01Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.059949 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:01Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.072602 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:01Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.084166 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:01Z is after 2025-08-24T17:21:41Z"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.150213 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.150274 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.150288 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.150306 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.150321 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.252793 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.252826 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.252835 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.252848 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.252857 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.355999 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.356071 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.356088 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.356113 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.356131 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.459740 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.459786 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.459800 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.459819 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.459833 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.562740 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.563008 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.563099 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.563191 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.563284 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.666173 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.666211 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.666223 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.666239 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.666250 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.769084 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.769128 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.769143 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.769160 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.769174 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.811963 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89"
Nov 25 12:09:01 crc kubenswrapper[4693]: E1125 12:09:01.812155 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.871722 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.871771 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.871782 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.871800 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.871811 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.974355 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.974493 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.974531 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.974563 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:01 crc kubenswrapper[4693]: I1125 12:09:01.974586 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:01Z","lastTransitionTime":"2025-11-25T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.077077 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.077152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.077185 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.077218 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.077240 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.180301 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.180353 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.180368 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.180419 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.180433 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.283966 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.284043 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.284071 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.284105 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.284129 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.387653 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.387710 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.387724 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.387743 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.387758 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.490948 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.491014 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.491036 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.491063 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.491100 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.594164 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.594217 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.594226 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.594240 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.594252 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.697568 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.697649 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.697668 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.697691 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.697707 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.800632 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.800685 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.800695 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.800717 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.800729 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.812116 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.812173 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 12:09:02 crc kubenswrapper[4693]: E1125 12:09:02.812286 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.812445 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 12:09:02 crc kubenswrapper[4693]: E1125 12:09:02.812802 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 12:09:02 crc kubenswrapper[4693]: E1125 12:09:02.813081 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
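The NodeNotReady heartbeats and the "Error syncing pod" entries above share a single root condition: the container runtime reports NetworkReady=false until at least one CNI network config exists under /etc/kubernetes/cni/net.d/, and ovn-kubernetes has not written one yet. A minimal sketch of that readiness test, assuming the usual libcni convention that .conf, .conflist, and .json files count as network configs:

```go
// Minimal sketch of the readiness condition behind "no CNI configuration file
// in /etc/kubernetes/cni/net.d/": NetworkReady stays false until at least one
// network config appears in the directory named in the log.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	const confDir = "/etc/kubernetes/cni/net.d" // directory from the log
	entries, err := os.ReadDir(confDir)
	if err != nil {
		log.Fatal(err)
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni accepts
			fmt.Println("CNI config:", filepath.Join(confDir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("NetworkReady=false: no CNI configuration file in", confDir)
	}
}
```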
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.813161 4693 scope.go:117] "RemoveContainer" containerID="c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4" Nov 25 12:09:02 crc kubenswrapper[4693]: E1125 12:09:02.813433 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.903313 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904040 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904116 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904241 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007463 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007476 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007494 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007506 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.903313 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904040 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904116 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:02 crc kubenswrapper[4693]: I1125 12:09:02.904241 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:02Z","lastTransitionTime":"2025-11-25T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007463 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007476 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007494 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.007506 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.109884 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.109929 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.109940 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.109957 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.109969 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.212222 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.212261 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.212273 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.212294 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.212309 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.319363 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.319421 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.319432 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.319449 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.319463 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.421459 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.421774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.421869 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.421947 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.422017 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.524646 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.524705 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.524724 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.524745 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.524761 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.627095 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.627129 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.627139 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.627163 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.627174 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.729582 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.729632 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.729647 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.729671 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.729686 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.811937 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89"
Nov 25 12:09:03 crc kubenswrapper[4693]: E1125 12:09:03.812466 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.831858 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.832222 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.832440 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.832603 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.832738 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.935920 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.936875 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.937043 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.937205 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:03 crc kubenswrapper[4693]: I1125 12:09:03.937351 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:03Z","lastTransitionTime":"2025-11-25T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.040853 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.041165 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.041286 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.041450 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.041594 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.144809 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.145138 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.145590 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.145986 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.146447 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.250073 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.250345 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.250497 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.250701 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.250794 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.353633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.354118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.354268 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.354404 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.354501 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.457795 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.458196 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.458490 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.458617 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.458756 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.561543 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.561604 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.561647 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.561672 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.561685 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.664550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.664951 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.665105 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.665239 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.665399 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.768320 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.768611 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.768817 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.769037 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.769240 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.812192 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:04 crc kubenswrapper[4693]: E1125 12:09:04.812696 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.812526 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:04 crc kubenswrapper[4693]: E1125 12:09:04.813145 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.812792 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:04 crc kubenswrapper[4693]: E1125 12:09:04.813623 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.873098 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.873146 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.873159 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.873178 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.873192 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.976519 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.977413 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.977577 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.977727 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:04 crc kubenswrapper[4693]: I1125 12:09:04.977854 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:04Z","lastTransitionTime":"2025-11-25T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.081169 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.081240 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.081264 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.081288 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.081303 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.183878 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.183939 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.183949 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.183968 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.183984 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.287172 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.287454 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.287521 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.287617 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.287703 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.390055 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.390108 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.390118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.390135 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.390146 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.492991 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.493037 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.493048 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.493065 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.493077 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.596079 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.596141 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.596152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.596169 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.596180 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.699185 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.699220 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.699230 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.699246 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.699259 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.801758 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.802024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.802103 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.802199 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.802292 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.812105 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:05 crc kubenswrapper[4693]: E1125 12:09:05.812264 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.905480 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.905521 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.905533 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.905550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:05 crc kubenswrapper[4693]: I1125 12:09:05.905562 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:05Z","lastTransitionTime":"2025-11-25T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.007681 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.007713 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.007730 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.007745 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.007755 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.110913 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.110975 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.110996 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.111025 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.111048 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.213228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.213259 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.213270 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.213284 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.213295 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.315105 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.315160 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.315174 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.315193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.315206 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.417335 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.417383 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.417394 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.417411 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.417422 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.520409 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.520447 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.520459 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.520503 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.520517 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.623177 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.623224 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.623239 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.623258 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.623272 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.725611 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.726094 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.726160 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.726227 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.726290 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.812359 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.812510 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:06 crc kubenswrapper[4693]: E1125 12:09:06.812663 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.812851 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:06 crc kubenswrapper[4693]: E1125 12:09:06.812941 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:06 crc kubenswrapper[4693]: E1125 12:09:06.813195 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.829237 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.829306 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.829327 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.829349 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.829366 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.932624 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.932660 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.932671 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.932688 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:06 crc kubenswrapper[4693]: I1125 12:09:06.932699 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:06Z","lastTransitionTime":"2025-11-25T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.034831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.034862 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.034873 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.034889 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.034900 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.137519 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.137552 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.137564 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.137580 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.137591 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.240118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.240173 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.240184 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.240205 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.240217 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.342507 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.342550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.342560 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.342577 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.342590 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.445327 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.445673 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.445771 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.445865 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.445961 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.549031 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.549071 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.549081 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.549097 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.549106 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.651716 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.651763 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.651778 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.651798 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.651811 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.754523 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.754601 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.754627 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.754661 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.754684 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.812461 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:07 crc kubenswrapper[4693]: E1125 12:09:07.812651 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.856870 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.857152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.857253 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.857390 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.857525 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.959965 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.960275 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.960385 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.960494 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:07 crc kubenswrapper[4693]: I1125 12:09:07.960591 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:07Z","lastTransitionTime":"2025-11-25T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.063027 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.063096 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.063107 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.063123 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.063133 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.165638 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.165672 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.165679 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.165692 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.165700 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.267834 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.267870 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.267880 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.267896 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.267907 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.301060 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:08Z is after 
2025-08-24T17:21:41Z"
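The status patch itself is well formed; it is rejected because the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, months before the log's current time. A small sketch that fetches and prints that certificate's validity window (assumes the endpoint is reachable from the node; InsecureSkipVerify is deliberate so the already-expired chain can be read rather than trusted):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// InsecureSkipVerify is intentional: the chain is already known to be
	// expired, and we only want to inspect NotBefore/NotAfter.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
	fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
}
```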
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:08Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.322856 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.322899 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.322908 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.322920 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.322930 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.340249 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:08Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.343442 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.343476 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.343485 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.343498 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.343506 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.353703 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:08Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.357485 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.357577 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.357601 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.358088 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.358364 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.369615 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:08Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.369799 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.371584 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.371618 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.371628 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.371646 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.371658 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.473919 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.473951 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.473960 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.473973 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.473984 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.576601 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.576640 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.576653 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.576670 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.576684 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.681531 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.681611 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.681640 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.681675 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.681700 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.784438 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.784476 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.784488 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.784503 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.784514 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.812435 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.812536 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.812562 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.812620 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.812778 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:08 crc kubenswrapper[4693]: E1125 12:09:08.812940 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.887826 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.887896 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.887911 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.887929 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.887944 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.990062 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.990087 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.990100 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.990115 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:08 crc kubenswrapper[4693]: I1125 12:09:08.990128 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:08Z","lastTransitionTime":"2025-11-25T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.091900 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.091945 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.091955 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.091969 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.091977 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.195313 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.195343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.195359 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.195388 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.195401 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.298665 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.298759 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.298774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.298793 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.298827 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.401784 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.401849 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.401866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.401890 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.401908 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.504827 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.504998 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.505024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.505048 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.505108 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.606903 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.606965 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.606982 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.607006 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.607023 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.710060 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.710136 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.710160 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.710193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.710215 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.812183 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:09 crc kubenswrapper[4693]: E1125 12:09:09.812349 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.814411 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.814452 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.814464 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.814500 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.814514 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.917221 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.917261 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.917272 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.917288 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:09 crc kubenswrapper[4693]: I1125 12:09:09.917299 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:09Z","lastTransitionTime":"2025-11-25T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.019431 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.019494 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.019506 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.019526 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.019538 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.122269 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.122325 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.122336 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.122356 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.122371 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.224276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.224310 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.224318 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.224331 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.224340 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.326972 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.326999 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.327007 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.327038 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.327047 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.430082 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.430131 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.430142 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.430158 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.430169 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.534151 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.534199 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.534212 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.534235 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.534246 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.636244 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.636287 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.636297 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.636313 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.636325 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.739419 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.739479 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.739512 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.739542 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.739562 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.812715 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.812750 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.812792 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:10 crc kubenswrapper[4693]: E1125 12:09:10.812926 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:10 crc kubenswrapper[4693]: E1125 12:09:10.813054 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:10 crc kubenswrapper[4693]: E1125 12:09:10.813121 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.830631 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.841511 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.841555 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.841569 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.841585 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.841596 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.847596 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.868902 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.880357 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.910118 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.925291 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.944234 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.944276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.944286 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.944302 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.944313 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:10Z","lastTransitionTime":"2025-11-25T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.944190 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.959594 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc 
kubenswrapper[4693]: I1125 12:09:10.971347 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:10 crc kubenswrapper[4693]: I1125 12:09:10.985309 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 
12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.002097 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:10Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.020151 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:11Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.037999 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:11Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.046982 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.047019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.047028 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.047043 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.047054 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.058840 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:11Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.070599 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:11Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.085418 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:11Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.098957 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:11Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.111194 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:11Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.149953 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.150043 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.150055 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.150071 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.150081 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.252915 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.252968 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.252985 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.253011 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.253191 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.356740 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.356825 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.357056 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.357086 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.357114 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.465407 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.465447 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.465457 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.465470 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.465480 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.568026 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.568105 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.568122 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.568146 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.568164 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.671397 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.671446 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.671458 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.671477 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.671489 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.775145 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.775214 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.775233 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.775259 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.775278 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.812436 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:11 crc kubenswrapper[4693]: E1125 12:09:11.812557 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.878039 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.878131 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.878148 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.878171 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.878188 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.980535 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.980608 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.980632 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.980664 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:11 crc kubenswrapper[4693]: I1125 12:09:11.980687 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:11Z","lastTransitionTime":"2025-11-25T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.083011 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.083070 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.083087 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.083111 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.083125 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.125747 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:12 crc kubenswrapper[4693]: E1125 12:09:12.125894 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:09:12 crc kubenswrapper[4693]: E1125 12:09:12.125979 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:09:44.125958916 +0000 UTC m=+104.044044297 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.185957 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.185999 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.186016 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.186037 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.186052 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.288711 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.288744 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.288752 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.288768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.288777 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.390911 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.390959 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.390970 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.390988 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.391000 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.493931 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.493976 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.493987 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.494000 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.494008 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.596413 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.596448 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.596460 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.596475 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.596485 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.698708 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.698755 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.698767 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.698787 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.698800 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.801029 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.801061 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.801070 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.801084 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.801094 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.812496 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.812511 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:12 crc kubenswrapper[4693]: E1125 12:09:12.812651 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:12 crc kubenswrapper[4693]: E1125 12:09:12.812733 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.812461 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:12 crc kubenswrapper[4693]: E1125 12:09:12.812984 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.903759 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.903837 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.903859 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.903894 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:12 crc kubenswrapper[4693]: I1125 12:09:12.903918 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:12Z","lastTransitionTime":"2025-11-25T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.007469 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.007509 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.007521 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.007538 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.007550 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.109781 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.109838 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.109858 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.109888 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.109915 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.212007 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.212045 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.212056 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.212071 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.212083 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.314141 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.314176 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.314184 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.314199 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.314207 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.419160 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.419214 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.419231 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.419249 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.419258 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.521494 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.521537 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.521546 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.521562 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.521571 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.623827 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.623866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.623875 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.623890 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.623899 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.725919 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.725959 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.725967 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.725983 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.725991 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.812301 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89"
Nov 25 12:09:13 crc kubenswrapper[4693]: E1125 12:09:13.812464 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.828393 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.828448 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.828463 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.828479 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.828489 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.930774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.930822 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.930836 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.930863 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:13 crc kubenswrapper[4693]: I1125 12:09:13.930877 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:13Z","lastTransitionTime":"2025-11-25T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.033753 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.033810 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.033825 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.033845 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.033862 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.137137 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.137204 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.137222 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.137245 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.137262 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
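The repeated NodeNotReady condition above is the kubelet's network-readiness check firing on every status sync: the node stays NotReady until some CNI plugin writes a network config into /etc/kubernetes/cni/net.d/. Below is a minimal standalone sketch of that presence check in Python; it is a hypothetical diagnostic helper, not kubelet code (the kubelet implements this in Go inside its runtime status loop), and the accepted file extensions are an assumption based on common CNI config names. Here ovn-kubernetes is the provider that should write that file, and it is crash-looping (see the ovnkube-controller CrashLoopBackOff entries further down), which is why the condition never clears.

#!/usr/bin/env python3
"""Sketch: the CNI-config presence check behind NetworkReady=false."""
import glob
import os
import sys

# Directory named in the log messages above.
CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"

def cni_ready(conf_dir: str) -> bool:
    # The network is considered ready once at least one CNI network
    # config exists; .conf, .conflist, and .json are the usual names
    # (assumption for this sketch).
    patterns = ("*.conf", "*.conflist", "*.json")
    return any(glob.glob(os.path.join(conf_dir, p)) for p in patterns)

if __name__ == "__main__":
    if cni_ready(CNI_CONF_DIR):
        print("CNI config present; NetworkReady should flip to true")
    else:
        print("no CNI configuration file found; node stays NotReady")
        sys.exit(1)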
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.232846 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/0.log"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.232935 4693 generic.go:334] "Generic (PLEG): container finished" podID="f714b419-cf37-48b7-9b1a-d36291d788a0" containerID="79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec" exitCode=1
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.232984 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerDied","Data":"79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec"}
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.233695 4693 scope.go:117] "RemoveContainer" containerID="79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.240579 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.240675 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.240702 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.240733 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.240758 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.248631 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.271423 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.285164 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.297917 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.310137 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.320693 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.330802 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.343633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.343674 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.343681 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.343712 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.343722 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.349812 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.358977 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.370416 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.381016 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.392184 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:13Z\\\",\\\"message\\\":\\\"2025-11-25T12:08:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90\\\\n2025-11-25T12:08:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90 to /host/opt/cni/bin/\\\\n2025-11-25T12:08:28Z [verbose] multus-daemon started\\\\n2025-11-25T12:08:28Z [verbose] Readiness Indicator file check\\\\n2025-11-25T12:09:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.403612 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.423639 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.435958 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.447261 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.447294 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.447305 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.447320 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.447333 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.447742 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.458311 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.473212 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:14Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.550058 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.550099 4693 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.550111 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.550131 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.550168 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.652413 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.652455 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.652469 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.652489 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.652502 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.754887 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.754922 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.754931 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.754944 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.754966 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.812646 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.812678 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.812678 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:14 crc kubenswrapper[4693]: E1125 12:09:14.812834 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:14 crc kubenswrapper[4693]: E1125 12:09:14.812903 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:14 crc kubenswrapper[4693]: E1125 12:09:14.812982 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.857830 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.857861 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.857871 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.857900 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.857909 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.960780 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.960818 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.960835 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.960851 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:14 crc kubenswrapper[4693]: I1125 12:09:14.960862 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:14Z","lastTransitionTime":"2025-11-25T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.063181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.063220 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.063228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.063255 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.063264 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.165172 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.165205 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.165217 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.165233 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.165243 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.236782 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/0.log" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.236820 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerStarted","Data":"382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.259588 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.267248 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.267299 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.267310 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.267327 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.267338 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.273456 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.286204 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.299824 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.316605 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.328600 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.369516 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.369546 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.369557 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.369573 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.369588 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.376296 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.386633 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.395810 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.405591 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.416103 4693 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.432435 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 
12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.448851 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.460183 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.471417 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.471587 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.471682 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.471773 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.471852 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.480660 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8
a34b241e9c47f7fa19fb1cf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.492661 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.504431 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:13Z\\\",\\\"message\\\":\\\"2025-11-25T12:08:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90\\\\n2025-11-25T12:08:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90 to /host/opt/cni/bin/\\\\n2025-11-25T12:08:28Z [verbose] multus-daemon started\\\\n2025-11-25T12:08:28Z [verbose] Readiness Indicator file check\\\\n2025-11-25T12:09:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:09:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.516031 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:15Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.574493 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.574535 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.574547 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.574562 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.574574 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.677036 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.677083 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.677096 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.677112 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.677123 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.779699 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.779948 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.780019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.780099 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.780165 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.812537 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:15 crc kubenswrapper[4693]: E1125 12:09:15.812683 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.882412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.882442 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.882451 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.882465 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.882503 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.984731 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.984972 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.985079 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.985157 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:15 crc kubenswrapper[4693]: I1125 12:09:15.985227 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:15Z","lastTransitionTime":"2025-11-25T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.087567 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.087632 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.087650 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.087675 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.087694 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.190052 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.190109 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.190126 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.190150 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.190168 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.291883 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.291932 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.291943 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.291959 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.291968 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.395003 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.395043 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.395054 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.395070 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.395082 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.497103 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.497142 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.497153 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.497167 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.497175 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.599296 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.599342 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.599357 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.599411 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.599431 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.703829 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.703867 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.703879 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.703897 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.703908 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.805789 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.805821 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.805829 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.805840 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.805851 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.812610 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.812657 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:16 crc kubenswrapper[4693]: E1125 12:09:16.812798 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.813109 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:16 crc kubenswrapper[4693]: E1125 12:09:16.813237 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:16 crc kubenswrapper[4693]: E1125 12:09:16.813257 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.813450 4693 scope.go:117] "RemoveContainer" containerID="c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.907754 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.907778 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.907786 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.907798 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:16 crc kubenswrapper[4693]: I1125 12:09:16.907806 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:16Z","lastTransitionTime":"2025-11-25T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.010472 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.010521 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.010532 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.010551 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.010563 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.112863 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.112904 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.112915 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.112931 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.112943 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.215679 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.215721 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.215732 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.215748 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.215760 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.245862 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/2.log" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.249364 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.252985 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.265979 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.280474 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.293168 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.305497 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.318429 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.318464 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.318473 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.318487 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.318496 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.318976 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.331667 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.342346 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.358437 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d
12ada0d9c46e2b402b4a90e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:09:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.368805 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.379985 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.391440 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:13Z\\\",\\\"message\\\":\\\"2025-11-25T12:08:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90\\\\n2025-11-25T12:08:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90 to /host/opt/cni/bin/\\\\n2025-11-25T12:08:28Z [verbose] multus-daemon started\\\\n2025-11-25T12:08:28Z [verbose] Readiness Indicator file check\\\\n2025-11-25T12:09:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:09:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.404842 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.415922 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.421091 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.421232 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.421320 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.421413 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.421485 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.426473 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.435470 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.448421 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.459343 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.479920 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:17Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.523653 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.523686 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.523695 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.523707 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.523716 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.627241 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.627566 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.627582 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.627602 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.627625 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.729839 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.729884 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.729895 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.729911 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.729923 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.812627 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:17 crc kubenswrapper[4693]: E1125 12:09:17.812749 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.832512 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.832549 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.832557 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.832571 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.832579 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.935295 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.935337 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.935345 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.935358 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:17 crc kubenswrapper[4693]: I1125 12:09:17.935382 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:17Z","lastTransitionTime":"2025-11-25T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.038164 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.038199 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.038209 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.038225 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.038236 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.140697 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.140753 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.140763 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.140782 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.140809 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.244140 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.244193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.244204 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.244223 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.244237 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.254801 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/3.log" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.255798 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/2.log" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.258786 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" exitCode=1 Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.258842 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.258897 4693 scope.go:117] "RemoveContainer" containerID="c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.259519 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.259672 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.273137 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.284922 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.299308 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.312327 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.332334 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.346921 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.346965 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.346974 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.346987 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.346995 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.349602 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.369983 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.384499 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.397253 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.413281 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 
12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.434611 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.449350 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.450723 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.450770 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.450781 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.450795 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.450804 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.461312 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.482297 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c018f0daef98682c8cf07624713a1c898110bda8a34b241e9c47f7fa19fb1cf4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:08:49Z\\\",\\\"message\\\":\\\"p[10.217.5.245:443: 10.217.5.245:9192:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {54fbe873-7e6d-475f-a0ad-8dd5f06d850d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1125 12:08:49.240312 6339 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1125 12:08:49.240319 6339 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1125 12:08:49.240327 6339 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:08:49.240283 6339 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm in node crc\\\\nI1125 12:08:49.240352 6339 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sn9jm after 0 failed attempt(s)\\\\nI1125 12:08:49.240359 6339 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sn9jm\\\\nI1125 12:08:49.238439 6339 services_controller.go:360] Finished syncing service downloads on namespace openshift-console for network=default : 3.112868ms\\\\nF1125 12:08:49.240387 6339 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"roller took 0.122850588 seconds. 
No OVN measurement.\\\\nI1125 12:09:18.094588 6701 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:09:18.094622 6701 factory.go:656] Stopping watch factory\\\\nI1125 12:09:18.094632 6701 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:09:18.094626 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1125 12:09:18.094647 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:09:18.094654 6701 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-storage-version-migrator-operator for network=default : 1.204986ms\\\\nI1125 12:09:18.094670 6701 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:09:18.094673 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1125 12:09:18.094683 6701 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 8.646697ms\\\\nF1125 12:09:18.094730 6701 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:09:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.482815 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.482844 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.482852 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.482867 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.482877 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.493974 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.496429 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.497550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.497574 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.497583 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.497599 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.497611 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.508679 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.510584 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:13Z\\\",\\\"message\\\":\\\"2025-11-25T12:08:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90\\\\n2025-11-25T12:08:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90 to /host/opt/cni/bin/\\\\n2025-11-25T12:08:28Z [verbose] multus-daemon started\\\\n2025-11-25T12:08:28Z [verbose] Readiness Indicator file check\\\\n2025-11-25T12:09:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:09:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.512057 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.512093 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.512103 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.512119 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.512129 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.522546 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.525835 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.529174 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.529225 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.529234 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.529246 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.529256 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.532907 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc 
kubenswrapper[4693]: E1125 12:09:18.541501 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.544480 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.544504 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 
crc kubenswrapper[4693]: I1125 12:09:18.544512 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.544523 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.544533 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.556876 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:18Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.556992 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.558491 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.558521 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.558528 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.558546 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.558557 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.660536 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.660582 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.660593 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.660622 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.660635 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.764579 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.764750 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.764764 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.764784 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.764795 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.812471 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.812531 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.812611 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.812729 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.812830 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:18 crc kubenswrapper[4693]: E1125 12:09:18.812962 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.867145 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.867178 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.867187 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.867202 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.867211 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.969658 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.969693 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.969702 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.969715 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:18 crc kubenswrapper[4693]: I1125 12:09:18.969724 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:18Z","lastTransitionTime":"2025-11-25T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.072816 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.072867 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.072880 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.072897 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.072911 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.175577 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.175639 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.175661 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.175686 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.175703 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.263214 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/3.log" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.266210 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:09:19 crc kubenswrapper[4693]: E1125 12:09:19.266363 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.278006 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.280818 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.280887 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.280904 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.280926 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.280947 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.291534 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.304673 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.315871 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.329141 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 
12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.342647 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.357851 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.379258 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d
12ada0d9c46e2b402b4a90e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"roller took 0.122850588 seconds. No OVN measurement.\\\\nI1125 12:09:18.094588 6701 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:09:18.094622 6701 factory.go:656] Stopping watch factory\\\\nI1125 12:09:18.094632 6701 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:09:18.094626 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1125 12:09:18.094647 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:09:18.094654 6701 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-storage-version-migrator-operator for network=default : 1.204986ms\\\\nI1125 12:09:18.094670 6701 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:09:18.094673 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1125 12:09:18.094683 6701 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 8.646697ms\\\\nF1125 12:09:18.094730 6701 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:09:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.391179 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.393355 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.393403 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.393412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.393427 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.393439 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.401430 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.414015 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:13Z\\\",\\\"message\\\":\\\"2025-11-25T12:08:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90\\\\n2025-11-25T12:08:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90 to /host/opt/cni/bin/\\\\n2025-11-25T12:08:28Z [verbose] multus-daemon started\\\\n2025-11-25T12:08:28Z [verbose] Readiness Indicator file check\\\\n2025-11-25T12:09:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:09:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.427305 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.439250 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.451258 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.461783 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.473966 4693 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.485828 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.495247 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.495304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.495316 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.495331 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.495360 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.502701 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:19Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.598071 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.598117 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.598125 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.598138 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.598146 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.700059 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.700096 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.700108 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.700123 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.700133 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.802638 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.802716 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.802727 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.802751 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.802768 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.811909 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:19 crc kubenswrapper[4693]: E1125 12:09:19.812153 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.906362 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.906428 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.906439 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.906456 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:19 crc kubenswrapper[4693]: I1125 12:09:19.906467 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:19Z","lastTransitionTime":"2025-11-25T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.009164 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.009200 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.009209 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.009223 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.009234 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.111889 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.111974 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.111994 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.112049 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.112065 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.215461 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.215493 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.215502 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.215515 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.215524 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.317323 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.317410 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.317422 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.317438 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.317449 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.420409 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.420482 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.420496 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.420515 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.420528 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.523646 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.523768 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.523792 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.523822 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.523884 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.627738 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.627806 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.627816 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.627832 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.627842 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.730282 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.730333 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.730343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.730356 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.730366 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.813059 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:20 crc kubenswrapper[4693]: E1125 12:09:20.813178 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.813404 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:20 crc kubenswrapper[4693]: E1125 12:09:20.813465 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.813675 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:20 crc kubenswrapper[4693]: E1125 12:09:20.813731 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.832812 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.833404 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.833438 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.833449 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.833486 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.833497 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.852289 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc 
kubenswrapper[4693]: I1125 12:09:20.865545 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.877155 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.889308 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.903616 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.913539 4693 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.935970 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.936011 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.936019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.936032 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.936041 4693 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:20Z","lastTransitionTime":"2025-11-25T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.946051 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"roller took 0.122850588 seconds. 
No OVN measurement.\\\\nI1125 12:09:18.094588 6701 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:09:18.094622 6701 factory.go:656] Stopping watch factory\\\\nI1125 12:09:18.094632 6701 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:09:18.094626 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1125 12:09:18.094647 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:09:18.094654 6701 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-storage-version-migrator-operator for network=default : 1.204986ms\\\\nI1125 12:09:18.094670 6701 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:09:18.094673 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1125 12:09:18.094683 6701 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 8.646697ms\\\\nF1125 12:09:18.094730 6701 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:09:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-o
penvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.963116 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.976591 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:20 crc kubenswrapper[4693]: I1125 12:09:20.990545 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:20Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.005221 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:13Z\\\",\\\"message\\\":\\\"2025-11-25T12:08:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90\\\\n2025-11-25T12:08:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90 to /host/opt/cni/bin/\\\\n2025-11-25T12:08:28Z [verbose] multus-daemon started\\\\n2025-11-25T12:08:28Z [verbose] Readiness Indicator file check\\\\n2025-11-25T12:09:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:09:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:21Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.021836 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:21Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.038024 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.038068 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.038096 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.038117 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.038130 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.040443 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c117d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:21Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.052428 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:21Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.063308 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:21Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.074882 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:21Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.092335 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5d
b7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\
":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:21Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.141043 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.141094 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.141110 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.141132 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.141148 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.244045 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.244095 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.244106 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.244124 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.244136 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.347008 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.347052 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.347060 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.347076 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.347085 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.450276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.450322 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.450339 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.450360 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.450414 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.552777 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.552819 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.552829 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.552842 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.552852 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.655586 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.655633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.655645 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.655660 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.655670 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.758134 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.758187 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.758205 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.758228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.758245 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.812127 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:21 crc kubenswrapper[4693]: E1125 12:09:21.812341 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.861149 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.861204 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.861221 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.861244 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.861262 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.964188 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.964271 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.964295 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.964326 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:21 crc kubenswrapper[4693]: I1125 12:09:21.964347 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:21Z","lastTransitionTime":"2025-11-25T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.067044 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.067105 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.067116 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.067138 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.067152 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.169799 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.169859 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.169874 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.169893 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.169908 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.272758 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.272810 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.272825 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.272848 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.272865 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.376129 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.376193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.376210 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.376232 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.376248 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.479558 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.479675 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.479733 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.479757 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.479773 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.582974 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.583038 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.583057 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.583084 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.583100 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.686121 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.686351 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.686359 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.686400 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.686412 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.789097 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.789163 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.789181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.789203 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.789220 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.812605 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.812673 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:22 crc kubenswrapper[4693]: E1125 12:09:22.812729 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.812775 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:22 crc kubenswrapper[4693]: E1125 12:09:22.812890 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:22 crc kubenswrapper[4693]: E1125 12:09:22.813041 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.891623 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.891667 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.891676 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.891690 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.891699 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.994942 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.995327 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.995523 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.995670 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:22 crc kubenswrapper[4693]: I1125 12:09:22.995798 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:22Z","lastTransitionTime":"2025-11-25T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.099007 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.099080 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.099103 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.099133 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.099155 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.202230 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.202289 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.202311 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.202326 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.202335 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.304110 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.304136 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.304144 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.304156 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.304183 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.407495 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.407542 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.407552 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.407571 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.407586 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.511085 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.511139 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.511152 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.511172 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.511187 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.614432 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.614497 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.614509 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.614541 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.614553 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.718361 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.718478 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.718501 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.718530 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.718551 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.817453 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:23 crc kubenswrapper[4693]: E1125 12:09:23.817702 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.821951 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.822010 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.822027 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.822050 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.822067 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.925079 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.925167 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.925184 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.925204 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:23 crc kubenswrapper[4693]: I1125 12:09:23.925218 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:23Z","lastTransitionTime":"2025-11-25T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.028118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.028178 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.028194 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.028219 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.028234 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.131053 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.131133 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.131154 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.131188 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.131255 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.234637 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.234701 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.234712 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.234729 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.234740 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.338260 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.338310 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.338320 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.338339 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.338350 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.440980 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.441043 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.441061 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.441086 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.441104 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.545049 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.545113 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.545134 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.545175 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.545195 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.558434 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.558705 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.558665897 +0000 UTC m=+148.476751308 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.648811 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.648866 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.648883 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.648930 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.648946 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.659495 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.659912 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.659805 4693 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.660045 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660082 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.660051032 +0000 UTC m=+148.578136413 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.660142 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660306 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660333 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660344 4693 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660352 4693 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660350 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660463 4693 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660490 4693 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660417 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.660397552 +0000 UTC m=+148.578482933 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660605 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.660578308 +0000 UTC m=+148.578663709 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.660631 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.660619189 +0000 UTC m=+148.578704580 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.751629 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.751690 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.751715 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.751740 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.751759 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.811984 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.812110 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.812224 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.812027 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.812514 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:24 crc kubenswrapper[4693]: E1125 12:09:24.812551 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.854546 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.854615 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.854633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.854659 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.854678 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.957074 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.957113 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.957122 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.957138 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:24 crc kubenswrapper[4693]: I1125 12:09:24.957147 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:24Z","lastTransitionTime":"2025-11-25T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.060163 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.060220 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.060241 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.060264 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.060282 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.163167 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.163219 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.163238 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.163265 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.163287 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.267163 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.267228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.267240 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.267262 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.267277 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.369265 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.369291 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.369299 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.369314 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.369322 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.472496 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.472556 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.472571 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.472593 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.472608 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.576715 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.576761 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.576772 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.576830 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.576876 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.680229 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.680277 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.680287 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.680305 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.680318 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.784013 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.784110 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.784128 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.784154 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.784172 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.812722 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:25 crc kubenswrapper[4693]: E1125 12:09:25.813232 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.886774 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.886841 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.886861 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.886886 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.886903 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.989991 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.990055 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.990072 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.990098 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:25 crc kubenswrapper[4693]: I1125 12:09:25.990115 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:25Z","lastTransitionTime":"2025-11-25T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.093256 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.093321 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.093338 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.093362 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.093413 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.195907 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.196011 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.196030 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.196056 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.196075 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.299010 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.299074 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.299088 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.299112 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.299132 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.402879 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.402939 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.402958 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.402987 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.403010 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.506183 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.506257 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.506278 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.506306 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.506327 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.609255 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.609311 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.609319 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.609347 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.609358 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.712312 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.712367 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.712410 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.712436 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.712456 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.812196 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:26 crc kubenswrapper[4693]: E1125 12:09:26.812456 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.812682 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.812813 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:26 crc kubenswrapper[4693]: E1125 12:09:26.812970 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:26 crc kubenswrapper[4693]: E1125 12:09:26.813129 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.820674 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.820759 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.820792 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.820843 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.820869 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.923895 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.923955 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.923972 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.923996 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:26 crc kubenswrapper[4693]: I1125 12:09:26.924013 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:26Z","lastTransitionTime":"2025-11-25T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.027938 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.028019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.028038 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.028060 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.028118 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.131185 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.131242 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.131254 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.131554 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.131611 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.234898 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.234966 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.234992 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.235025 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.235055 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.338489 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.338576 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.338600 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.338641 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.338665 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.442766 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.443072 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.443311 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.443599 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.443756 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.547655 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.547718 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.547736 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.547764 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.547787 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.651118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.651905 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.651984 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.652063 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.652131 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.756258 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.756303 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.756317 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.756343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.756357 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.812325 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:27 crc kubenswrapper[4693]: E1125 12:09:27.812638 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.860038 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.860095 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.860112 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.860137 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.860157 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.963284 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.963423 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.963443 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.963473 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:27 crc kubenswrapper[4693]: I1125 12:09:27.963494 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:27Z","lastTransitionTime":"2025-11-25T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.065689 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.065744 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.065753 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.065770 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.065780 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.168423 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.168479 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.168494 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.168513 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.168526 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.271128 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.271223 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.271241 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.271267 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.271285 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.375204 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.375304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.375332 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.375365 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.375433 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.477943 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.477978 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.477986 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.477999 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.478007 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.581953 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.582029 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.582049 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.582662 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.582727 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.685798 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.685855 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.685871 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.685893 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.685910 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.788243 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.788305 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.788315 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.788328 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.788338 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.811836 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.811892 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:28 crc kubenswrapper[4693]: E1125 12:09:28.811950 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.811967 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:28 crc kubenswrapper[4693]: E1125 12:09:28.812126 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:28 crc kubenswrapper[4693]: E1125 12:09:28.812166 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.873831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.873890 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.873899 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.873913 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.873926 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: E1125 12:09:28.887112 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.896155 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.896217 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.896232 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.896255 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.896271 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: E1125 12:09:28.939006 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.944486 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.944587 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.944603 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.944630 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.944651 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: E1125 12:09:28.964684 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.970315 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.970409 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.970429 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.970453 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.970470 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:28 crc kubenswrapper[4693]: E1125 12:09:28.986824 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:28Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.991731 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.991790 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.991807 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.991829 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:28 crc kubenswrapper[4693]: I1125 12:09:28.991843 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:28Z","lastTransitionTime":"2025-11-25T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: E1125 12:09:29.004852 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5664b910-f808-4ca5-913a-47b9ca069334\\\",\\\"systemUUID\\\":\\\"7a8e2f4e-48fc-44c0-960c-1aabd39a7dcc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:29Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:29 crc kubenswrapper[4693]: E1125 12:09:29.004980 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.007163 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.007194 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.007203 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.007232 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.007244 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.110431 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.110478 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.110488 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.110504 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.110516 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.213831 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.213895 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.213908 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.213930 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.213949 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.316546 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.316611 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.316622 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.316641 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.316655 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.419582 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.419642 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.419655 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.419677 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.419689 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.522120 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.522176 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.522188 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.522206 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.522218 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.625977 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.626045 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.626059 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.626079 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.626092 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.728681 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.728756 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.728780 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.728811 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.728837 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.812298 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:29 crc kubenswrapper[4693]: E1125 12:09:29.812673 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.831671 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.831750 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.831769 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.831799 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.831819 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.935304 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.935402 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.935423 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.935454 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:29 crc kubenswrapper[4693]: I1125 12:09:29.935475 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:29Z","lastTransitionTime":"2025-11-25T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.038134 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.038197 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.038216 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.038241 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.038257 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.142293 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.142409 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.142434 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.142487 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.142508 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.246040 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.246114 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.246132 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.246159 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.246176 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.352916 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.352962 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.352977 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.353000 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.353016 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.456138 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.456181 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.456193 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.456217 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.456229 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.559843 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.559905 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.559922 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.559947 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.559966 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.662855 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.662909 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.662928 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.662956 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.662975 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.766170 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.766232 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.766250 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.766276 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.766296 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.812656 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.812799 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:30 crc kubenswrapper[4693]: E1125 12:09:30.812997 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.813033 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:30 crc kubenswrapper[4693]: E1125 12:09:30.813450 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:30 crc kubenswrapper[4693]: E1125 12:09:30.813683 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.836254 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb1074b560366714a6f0048cca2f38b571c7fe0018019331def03c09e0d5d401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.861441 4693 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.869848 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.869930 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.869964 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.869984 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.869998 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.875882 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f238a1e7-499b-466f-b643-bef0ae6f5e5f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58791c19e688514c92e35ea107e7ec3685a74799045a6160d1ca22f80cbd3f6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pw97\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6d66d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.896542 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8271578-d9e5-4777-8689-da8dd38edfb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2077427d55f01ce5867183b07ddda96180767bc14e9f6a69cf9dea502a416ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac8873da5d67d05c814b083acf7e6bed1fa5187b16361963046f8b22d07dbac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08edad021f69e86d33401c14368923f612a18d9934bcacee2d137bf57c91d754\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6e21a5f2e5ba64ad51e70af98a301e3a9f5fbd1090e3981b030161e76236a64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://755d861b3a8db9ab380608b8c94cc18c61da0762ff7404af48caccaf60b58ee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://640168658f70becc5947611057b7c1e8ca6f6851957386c4b027515c66eb98ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a54058ae1f8d511c149bc0b6c6609a0ed9baaeb4d2b6758060ada9483033f6de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95qrg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5rpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 
2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.917849 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"39f22db8-dbb0-4fa7-9f01-cdcabb55a43f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9481e7a4f9a08d4c89701299eef3d517c748c23b9c5742014abb88ac7f25a1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198a46a9527deb5fe0001b3c3b169daa71747d4d1c326dae285ed2a44868e37d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2831df5952b5bc358ee108ee262b2c119af7c81a6d75b6a499927c1be036efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.938220 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81a409a0-b655-4293-a8e5-3a3a9b99851e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://362dbcf9a07d5b854e6c744b4126e78fc208a5ffe9e2c44b06dc26d385f821a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8f95d34fecbd00f74fbc3b7248dba5a159abcf10c4288c7e3e9fe7f7b574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9db0e9b444abde365d7205c10c7dde91e1a7fcd40e86e2736056b2cab9e47fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08494962668a4676957e4a81dc44d5b138320c1
17d427b654fa6f472f40c899f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dedba3e346bfab84dc70280fb1b58ada06b0785f63904e06f7b64eaf1922f368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70590ce7b6729b39fa059735a5bd8a0630e13ed2113fd476a8880da80c1f4a7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f979fb4f3ab6aec9f4f6051d8c8e9e7a2044156f9828b89d7920bd0949a057e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f46fc0f98af4fd7aa58832669bf116485b4fcfc0eecc25c6af9ae2ad967ac64b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.951640 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.965787 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc3f2fcf1337e69d463c1ad531d3af280343d4b9e2967551000a54cd86a068e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c01b1269420a49b1e23d5bc60b018956d014d61060cb2a7b9ac9fec023d3cbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.972963 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.973198 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.973453 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.973633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.973813 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:30Z","lastTransitionTime":"2025-11-25T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.982252 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c426dd553100ad02f7241ac80b50fb4912b0b970635643a9457714f1b5364cd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:30 crc kubenswrapper[4693]: I1125 12:09:30.994010 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xl6bh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399a78ed-8e91-43cb-89c1-6732d17d6e41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4ae61af534704aa43cf287c6e6da32698eee5a341d4b9f97487090d177d580da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtj7j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xl6bh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:30Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.011361 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d0c76695-d437-4d1b-92e1-37b2b5b045f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04fa74907d1047eea84d8c21f6b1903e911bdf897e040b7c81b3e66e9bfd066f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d30380298384d04ac04affad25bee46d3e72b9b87f5933405e776c53ae5a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qnhdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9tf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z" Nov 25 
12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.032968 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1125 12:08:14.643282 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1125 12:08:14.644843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-663771027/tls.crt::/tmp/serving-cert-663771027/tls.key\\\\\\\"\\\\nI1125 12:08:20.370261 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1125 12:08:20.375403 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1125 12:08:20.375425 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1125 12:08:20.375442 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1125 12:08:20.375446 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1125 12:08:20.427040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1125 12:08:20.427084 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1125 12:08:20.427105 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427151 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1125 12:08:20.427169 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1125 12:08:20.427185 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1125 12:08:20.427195 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1125 12:08:20.427204 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1125 12:08:20.430654 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.051145 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.076939 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.077000 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.077017 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.077041 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.077057 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.083459 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c247f7d-6187-4052-baee-5c5841e1d9da\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:18Z\\\",\\\"message\\\":\\\"roller took 0.122850588 seconds. 
No OVN measurement.\\\\nI1125 12:09:18.094588 6701 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1125 12:09:18.094622 6701 factory.go:656] Stopping watch factory\\\\nI1125 12:09:18.094632 6701 ovnkube.go:599] Stopped ovnkube\\\\nI1125 12:09:18.094626 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1125 12:09:18.094647 6701 handler.go:208] Removed *v1.Node event handler 2\\\\nI1125 12:09:18.094654 6701 services_controller.go:360] Finished syncing service metrics on namespace openshift-kube-storage-version-migrator-operator for network=default : 1.204986ms\\\\nI1125 12:09:18.094670 6701 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1125 12:09:18.094673 6701 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication/oauth-openshift\\\\\\\"}\\\\nI1125 12:09:18.094683 6701 services_controller.go:360] Finished syncing service oauth-openshift on namespace openshift-authentication for network=default : 8.646697ms\\\\nF1125 12:09:18.094730 6701 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:09:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-o
penvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dlfn9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sn9jm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.100520 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-n2f89" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a10eb19c-b500-4cf9-961d-1892ba67560a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2rjc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-n2f89\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.113865 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2gdxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c80180-d8e5-4615-bb4d-0cd9bea27923\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cd231644679f05b10355b3eaf0267bbf33cb901ece7451a8e5b07c8e5440a47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pjk62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2gdxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.131420 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6l9jx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f714b419-cf37-48b7-9b1a-d36291d788a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-25T12:09:13Z\\\",\\\"message\\\":\\\"2025-11-25T12:08:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90\\\\n2025-11-25T12:08:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c8dd163c-0928-4965-98de-edb2f4dcdc90 to /host/opt/cni/bin/\\\\n2025-11-25T12:08:28Z [verbose] multus-daemon started\\\\n2025-11-25T12:08:28Z [verbose] Readiness Indicator file check\\\\n2025-11-25T12:09:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:09:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4q5dx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6l9jx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z"
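
The terminated lastState captured in that patch shows why kube-multus is on restartCount 1: the daemon waits for the default network's readiness-indicator file, /host/run/multus/cni/net.d/10-ovn-kubernetes.conf, and exits 1 when the poll times out ("pollimmediate error: timed out waiting for the condition"). In outline the check is just polling for a file until a deadline; a sketch of that idea (file path from the log; the interval and timeout here are illustrative, and the real daemon uses the apimachinery wait helpers rather than this loop):

```go
// Poll for a readiness-indicator file until a deadline, the pattern behind
// the "still waiting for readinessindicatorfile" / "timed out" messages above.
package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil // indicator file exists: default network is ready
		}
		if time.Now().After(deadline) {
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf",
		time.Second, 45*time.Second)
	fmt.Println(err) // on timeout the container logs this and exits 1
}
```
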
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0d8f418-ddc7-48e4-9d11-4567bc98232e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-25T12:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aad6fba70273e809f50e6a66fbb6fa507e315cef0b0c2b0fb6c635e306928d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7e72b10217a3f81425d1f0243df0f4a40ab73aba0e06403d000bfacf0b6a6ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08df000c8fd24ff58f995a96bc5bf8e665130996de15cf6d139575dcb8284002\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb697eedc715dbb8fd8b4e6f0902144046961086ae8580e761729ff9cd61295e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:08:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-25T12:08:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-25T12:09:31Z is after 2025-08-24T17:21:41Z" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.179566 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.179623 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.179640 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.179667 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.179684 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.283082 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.283177 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.283233 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.283257 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.283310 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.386476 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.386534 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.386551 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.386576 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.386593 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.489779 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.489912 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.489933 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.489956 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.489974 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.593044 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.593098 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.593115 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.593138 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.593155 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.696063 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.696097 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.696107 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.696122 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.696133 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.798633 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.798680 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.798696 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.798716 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.798728 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.812038 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:31 crc kubenswrapper[4693]: E1125 12:09:31.812289 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.901575 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.901982 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.902133 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.902289 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:31 crc kubenswrapper[4693]: I1125 12:09:31.902522 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:31Z","lastTransitionTime":"2025-11-25T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.005072 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.005111 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.005120 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.005135 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.005145 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.108497 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.108558 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.108575 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.108600 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.108620 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.211466 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.211798 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.211916 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.212018 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.212101 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.316238 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.316623 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.316785 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.316983 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.317117 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.419248 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.419288 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.419297 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.419317 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.419329 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.522707 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.522773 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.522792 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.522820 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.522841 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.625254 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.625326 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.625355 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.625419 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.625443 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.728626 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.728687 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.728709 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.728737 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.728755 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.812685 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.812799 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:32 crc kubenswrapper[4693]: E1125 12:09:32.812851 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:32 crc kubenswrapper[4693]: E1125 12:09:32.812981 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.813722 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:32 crc kubenswrapper[4693]: E1125 12:09:32.813857 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.814037 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:09:32 crc kubenswrapper[4693]: E1125 12:09:32.814348 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.832188 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.832233 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.832244 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.832263 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.832280 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.935724 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.936213 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.936412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.936593 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:32 crc kubenswrapper[4693]: I1125 12:09:32.936756 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:32Z","lastTransitionTime":"2025-11-25T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.039924 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.039974 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.039991 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.040019 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.040037 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.143412 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.143452 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.143468 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.143492 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.143507 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.246958 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.247020 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.247038 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.247067 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.247089 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.351693 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.351760 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.351779 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.351805 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.351822 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.454636 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.454695 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.454713 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.454742 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.454771 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.557476 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.557541 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.557566 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.557596 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.557616 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.660435 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.660509 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.660530 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.660550 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.660564 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.767164 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.767255 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.767281 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.767330 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.767441 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.812695 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:33 crc kubenswrapper[4693]: E1125 12:09:33.812915 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.828589 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.871840 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.871898 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.871914 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.871938 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.871957 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.975118 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.975186 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.975204 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.975230 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:33 crc kubenswrapper[4693]: I1125 12:09:33.975248 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:33Z","lastTransitionTime":"2025-11-25T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.078565 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.078668 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.078688 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.078716 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.078739 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.182230 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.182298 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.182316 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.182343 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.182361 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.285875 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.285951 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.285969 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.285995 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.286012 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.390117 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.390184 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.390202 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.390228 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.390251 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.493212 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.493266 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.493282 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.493309 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.493328 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.596812 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.596878 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.596902 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.596935 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.596963 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.701038 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.701110 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.701127 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.701154 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.701171 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.804835 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.804888 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.804907 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.804931 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.804948 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.812799 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.812861 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.812848 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:34 crc kubenswrapper[4693]: E1125 12:09:34.813040 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:34 crc kubenswrapper[4693]: E1125 12:09:34.813865 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:34 crc kubenswrapper[4693]: E1125 12:09:34.813959 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.908897 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.908961 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.908979 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.909012 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:34 crc kubenswrapper[4693]: I1125 12:09:34.909042 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:34Z","lastTransitionTime":"2025-11-25T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 25 12:09:35 crc kubenswrapper[4693]: I1125 12:09:35.011720 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 25 12:09:35 crc kubenswrapper[4693]: I1125 12:09:35.011779 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 25 12:09:35 crc kubenswrapper[4693]: I1125 12:09:35.011798 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 25 12:09:35 crc kubenswrapper[4693]: I1125 12:09:35.011825 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 25 12:09:35 crc kubenswrapper[4693]: I1125 12:09:35.011855 4693 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-25T12:09:35Z","lastTransitionTime":"2025-11-25T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 25 12:09:35 crc kubenswrapper[4693]: I1125 12:09:35.812451 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89"
Nov 25 12:09:35 crc kubenswrapper[4693]: E1125 12:09:35.812702 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a"
Nov 25 12:09:36 crc kubenswrapper[4693]: I1125 12:09:36.811868 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 12:09:36 crc kubenswrapper[4693]: I1125 12:09:36.811930 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 12:09:36 crc kubenswrapper[4693]: E1125 12:09:36.812318 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 12:09:36 crc kubenswrapper[4693]: E1125 12:09:36.812711 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 12:09:36 crc kubenswrapper[4693]: I1125 12:09:36.811929 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 12:09:36 crc kubenswrapper[4693]: E1125 12:09:36.813087 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 12:09:37 crc kubenswrapper[4693]: I1125 12:09:37.812023 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89"
Nov 25 12:09:37 crc kubenswrapper[4693]: E1125 12:09:37.812229 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a"
Nov 25 12:09:38 crc kubenswrapper[4693]: I1125 12:09:38.812712 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 25 12:09:38 crc kubenswrapper[4693]: I1125 12:09:38.812763 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 25 12:09:38 crc kubenswrapper[4693]: I1125 12:09:38.812809 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 25 12:09:38 crc kubenswrapper[4693]: E1125 12:09:38.812960 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 25 12:09:38 crc kubenswrapper[4693]: E1125 12:09:38.813054 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 25 12:09:38 crc kubenswrapper[4693]: E1125 12:09:38.813189 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.298063 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9"]
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.298819 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.302974 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.304956 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.308538 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.308667 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.359546 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12ddd224-7102-414e-8d24-f2640fc5b0f1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.359677 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ddd224-7102-414e-8d24-f2640fc5b0f1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.359736 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12ddd224-7102-414e-8d24-f2640fc5b0f1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.359896 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/12ddd224-7102-414e-8d24-f2640fc5b0f1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.359988 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/12ddd224-7102-414e-8d24-f2640fc5b0f1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.359854 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=47.359820598 podStartE2EDuration="47.359820598s" podCreationTimestamp="2025-11-25 12:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.34770717 +0000 UTC m=+99.265792591" watchObservedRunningTime="2025-11-25 12:09:39.359820598 +0000 UTC m=+99.277906019"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.377900 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.377870991 podStartE2EDuration="6.377870991s" podCreationTimestamp="2025-11-25 12:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.360461147 +0000 UTC m=+99.278546578" watchObservedRunningTime="2025-11-25 12:09:39.377870991 +0000 UTC m=+99.295956412"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.401922 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6l9jx" podStartSLOduration=74.401838889 podStartE2EDuration="1m14.401838889s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.401099248 +0000 UTC m=+99.319184679" watchObservedRunningTime="2025-11-25 12:09:39.401838889 +0000 UTC m=+99.319924310"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.402602 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2gdxx" podStartSLOduration=74.402583212 podStartE2EDuration="1m14.402583212s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.379788688 +0000 UTC m=+99.297874109" watchObservedRunningTime="2025-11-25 12:09:39.402583212 +0000 UTC m=+99.320668633"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.421364 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podStartSLOduration=74.421315415 podStartE2EDuration="1m14.421315415s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.420211522 +0000 UTC m=+99.338296943" watchObservedRunningTime="2025-11-25 12:09:39.421315415 +0000 UTC m=+99.339400826"
Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.453087 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5rpbd" podStartSLOduration=74.453064423 podStartE2EDuration="1m14.453064423s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.450785525 +0000 UTC m=+99.368870936" watchObservedRunningTime="2025-11-25 12:09:39.453064423 +0000 UTC m=+99.371149814"
\"kubernetes.io/host-path/12ddd224-7102-414e-8d24-f2640fc5b0f1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.460822 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12ddd224-7102-414e-8d24-f2640fc5b0f1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.460875 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ddd224-7102-414e-8d24-f2640fc5b0f1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.460909 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12ddd224-7102-414e-8d24-f2640fc5b0f1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.460995 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/12ddd224-7102-414e-8d24-f2640fc5b0f1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.461087 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/12ddd224-7102-414e-8d24-f2640fc5b0f1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.461147 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/12ddd224-7102-414e-8d24-f2640fc5b0f1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.463223 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12ddd224-7102-414e-8d24-f2640fc5b0f1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.473727 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12ddd224-7102-414e-8d24-f2640fc5b0f1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: 
\"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.475563 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=79.475542597 podStartE2EDuration="1m19.475542597s" podCreationTimestamp="2025-11-25 12:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.475265749 +0000 UTC m=+99.393351150" watchObservedRunningTime="2025-11-25 12:09:39.475542597 +0000 UTC m=+99.393627978" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.480474 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12ddd224-7102-414e-8d24-f2640fc5b0f1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-h7cp9\" (UID: \"12ddd224-7102-414e-8d24-f2640fc5b0f1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.518524 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.518501096 podStartE2EDuration="1m15.518501096s" podCreationTimestamp="2025-11-25 12:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.517071344 +0000 UTC m=+99.435156765" watchObservedRunningTime="2025-11-25 12:09:39.518501096 +0000 UTC m=+99.436586487" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.590353 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xl6bh" podStartSLOduration=74.590331138 podStartE2EDuration="1m14.590331138s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.58939861 +0000 UTC m=+99.507484021" watchObservedRunningTime="2025-11-25 12:09:39.590331138 +0000 UTC m=+99.508416529" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.604760 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9tf" podStartSLOduration=73.604728323 podStartE2EDuration="1m13.604728323s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.604580229 +0000 UTC m=+99.522665650" watchObservedRunningTime="2025-11-25 12:09:39.604728323 +0000 UTC m=+99.522813704" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.629105 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.629081653 podStartE2EDuration="1m19.629081653s" podCreationTimestamp="2025-11-25 12:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:39.628903258 +0000 UTC m=+99.546988719" watchObservedRunningTime="2025-11-25 12:09:39.629081653 +0000 UTC m=+99.547167074" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.656175 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" Nov 25 12:09:39 crc kubenswrapper[4693]: I1125 12:09:39.812725 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:39 crc kubenswrapper[4693]: E1125 12:09:39.813447 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:40 crc kubenswrapper[4693]: I1125 12:09:40.363249 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" event={"ID":"12ddd224-7102-414e-8d24-f2640fc5b0f1","Type":"ContainerStarted","Data":"1ee525e1a8dec8dd3ff28769fca065009dc4508783d75e9931dec60ba08bc9ab"} Nov 25 12:09:40 crc kubenswrapper[4693]: I1125 12:09:40.363322 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" event={"ID":"12ddd224-7102-414e-8d24-f2640fc5b0f1","Type":"ContainerStarted","Data":"98bc8eb59a96092ac073f724c32b58549a1a72b44c314139457fc33ca0c4a4f5"} Nov 25 12:09:40 crc kubenswrapper[4693]: I1125 12:09:40.383477 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-h7cp9" podStartSLOduration=75.383452669 podStartE2EDuration="1m15.383452669s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:09:40.382420939 +0000 UTC m=+100.300506360" watchObservedRunningTime="2025-11-25 12:09:40.383452669 +0000 UTC m=+100.301538080" Nov 25 12:09:40 crc kubenswrapper[4693]: I1125 12:09:40.812587 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:40 crc kubenswrapper[4693]: I1125 12:09:40.812623 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:40 crc kubenswrapper[4693]: I1125 12:09:40.812681 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:40 crc kubenswrapper[4693]: E1125 12:09:40.814001 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:40 crc kubenswrapper[4693]: E1125 12:09:40.814088 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:40 crc kubenswrapper[4693]: E1125 12:09:40.814181 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:41 crc kubenswrapper[4693]: I1125 12:09:41.812186 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:41 crc kubenswrapper[4693]: E1125 12:09:41.812439 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:42 crc kubenswrapper[4693]: I1125 12:09:42.812739 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:42 crc kubenswrapper[4693]: E1125 12:09:42.812923 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:42 crc kubenswrapper[4693]: I1125 12:09:42.812744 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:42 crc kubenswrapper[4693]: E1125 12:09:42.813154 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:42 crc kubenswrapper[4693]: I1125 12:09:42.813275 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:42 crc kubenswrapper[4693]: E1125 12:09:42.813352 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:43 crc kubenswrapper[4693]: I1125 12:09:43.812593 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:43 crc kubenswrapper[4693]: E1125 12:09:43.812879 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:44 crc kubenswrapper[4693]: I1125 12:09:44.218155 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:44 crc kubenswrapper[4693]: E1125 12:09:44.218481 4693 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:09:44 crc kubenswrapper[4693]: E1125 12:09:44.218600 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs podName:a10eb19c-b500-4cf9-961d-1892ba67560a nodeName:}" failed. No retries permitted until 2025-11-25 12:10:48.218564 +0000 UTC m=+168.136649421 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs") pod "network-metrics-daemon-n2f89" (UID: "a10eb19c-b500-4cf9-961d-1892ba67560a") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 25 12:09:44 crc kubenswrapper[4693]: I1125 12:09:44.812652 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:44 crc kubenswrapper[4693]: I1125 12:09:44.812674 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:44 crc kubenswrapper[4693]: E1125 12:09:44.813359 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:44 crc kubenswrapper[4693]: E1125 12:09:44.813764 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:44 crc kubenswrapper[4693]: I1125 12:09:44.813879 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:44 crc kubenswrapper[4693]: E1125 12:09:44.814063 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:45 crc kubenswrapper[4693]: I1125 12:09:45.812344 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:45 crc kubenswrapper[4693]: E1125 12:09:45.812517 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:46 crc kubenswrapper[4693]: I1125 12:09:46.811916 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:46 crc kubenswrapper[4693]: E1125 12:09:46.812116 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:46 crc kubenswrapper[4693]: I1125 12:09:46.812517 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:46 crc kubenswrapper[4693]: E1125 12:09:46.812694 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:46 crc kubenswrapper[4693]: I1125 12:09:46.813003 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:46 crc kubenswrapper[4693]: E1125 12:09:46.813137 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:47 crc kubenswrapper[4693]: I1125 12:09:47.812285 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:47 crc kubenswrapper[4693]: I1125 12:09:47.812922 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:09:47 crc kubenswrapper[4693]: E1125 12:09:47.813603 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sn9jm_openshift-ovn-kubernetes(4c247f7d-6187-4052-baee-5c5841e1d9da)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" Nov 25 12:09:47 crc kubenswrapper[4693]: E1125 12:09:47.813891 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:48 crc kubenswrapper[4693]: I1125 12:09:48.812160 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:48 crc kubenswrapper[4693]: I1125 12:09:48.812161 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:48 crc kubenswrapper[4693]: E1125 12:09:48.812294 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:48 crc kubenswrapper[4693]: I1125 12:09:48.812472 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:48 crc kubenswrapper[4693]: E1125 12:09:48.812654 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:48 crc kubenswrapper[4693]: E1125 12:09:48.812781 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:49 crc kubenswrapper[4693]: I1125 12:09:49.812921 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:49 crc kubenswrapper[4693]: E1125 12:09:49.813182 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:50 crc kubenswrapper[4693]: I1125 12:09:50.812223 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:50 crc kubenswrapper[4693]: I1125 12:09:50.812222 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:50 crc kubenswrapper[4693]: E1125 12:09:50.812502 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:50 crc kubenswrapper[4693]: I1125 12:09:50.812635 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:50 crc kubenswrapper[4693]: E1125 12:09:50.813851 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:50 crc kubenswrapper[4693]: E1125 12:09:50.813985 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:51 crc kubenswrapper[4693]: I1125 12:09:51.812766 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:51 crc kubenswrapper[4693]: E1125 12:09:51.813164 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:52 crc kubenswrapper[4693]: I1125 12:09:52.812402 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:52 crc kubenswrapper[4693]: I1125 12:09:52.812461 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:52 crc kubenswrapper[4693]: I1125 12:09:52.812506 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:52 crc kubenswrapper[4693]: E1125 12:09:52.812552 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:52 crc kubenswrapper[4693]: E1125 12:09:52.812638 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:52 crc kubenswrapper[4693]: E1125 12:09:52.812708 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:53 crc kubenswrapper[4693]: I1125 12:09:53.812493 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:53 crc kubenswrapper[4693]: E1125 12:09:53.813566 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:54 crc kubenswrapper[4693]: I1125 12:09:54.812678 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:54 crc kubenswrapper[4693]: E1125 12:09:54.812878 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:54 crc kubenswrapper[4693]: I1125 12:09:54.813220 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:54 crc kubenswrapper[4693]: E1125 12:09:54.813348 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:54 crc kubenswrapper[4693]: I1125 12:09:54.813631 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:54 crc kubenswrapper[4693]: E1125 12:09:54.813773 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:55 crc kubenswrapper[4693]: I1125 12:09:55.812470 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:55 crc kubenswrapper[4693]: E1125 12:09:55.812609 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:56 crc kubenswrapper[4693]: I1125 12:09:56.812440 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:56 crc kubenswrapper[4693]: I1125 12:09:56.812547 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:56 crc kubenswrapper[4693]: I1125 12:09:56.812581 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:56 crc kubenswrapper[4693]: E1125 12:09:56.812646 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:56 crc kubenswrapper[4693]: E1125 12:09:56.812743 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:56 crc kubenswrapper[4693]: E1125 12:09:56.812909 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:57 crc kubenswrapper[4693]: I1125 12:09:57.812266 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:57 crc kubenswrapper[4693]: E1125 12:09:57.813798 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:09:58 crc kubenswrapper[4693]: I1125 12:09:58.812155 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:09:58 crc kubenswrapper[4693]: I1125 12:09:58.812581 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:09:58 crc kubenswrapper[4693]: E1125 12:09:58.813001 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:09:58 crc kubenswrapper[4693]: E1125 12:09:58.813097 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:09:58 crc kubenswrapper[4693]: I1125 12:09:58.812847 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:09:58 crc kubenswrapper[4693]: E1125 12:09:58.814214 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:09:59 crc kubenswrapper[4693]: I1125 12:09:59.814578 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:09:59 crc kubenswrapper[4693]: E1125 12:09:59.814850 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.438870 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/1.log" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.439795 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/0.log" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.439849 4693 generic.go:334] "Generic (PLEG): container finished" podID="f714b419-cf37-48b7-9b1a-d36291d788a0" containerID="382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486" exitCode=1 Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.439893 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerDied","Data":"382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486"} Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.439944 4693 scope.go:117] "RemoveContainer" containerID="79d56c95d0243a54e8fd86d758ae938c6973bd2ad0042ed105ac10dd2357aeec" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.440792 4693 scope.go:117] "RemoveContainer" containerID="382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486" Nov 25 12:10:00 crc kubenswrapper[4693]: E1125 12:10:00.441197 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6l9jx_openshift-multus(f714b419-cf37-48b7-9b1a-d36291d788a0)\"" pod="openshift-multus/multus-6l9jx" podUID="f714b419-cf37-48b7-9b1a-d36291d788a0" Nov 25 12:10:00 crc kubenswrapper[4693]: E1125 12:10:00.763602 4693 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.812353 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.812584 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:00 crc kubenswrapper[4693]: E1125 12:10:00.813909 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.813957 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:00 crc kubenswrapper[4693]: E1125 12:10:00.814103 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:00 crc kubenswrapper[4693]: E1125 12:10:00.814221 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:00 crc kubenswrapper[4693]: I1125 12:10:00.815190 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:10:00 crc kubenswrapper[4693]: E1125 12:10:00.930705 4693 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 25 12:10:01 crc kubenswrapper[4693]: I1125 12:10:01.445097 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/1.log" Nov 25 12:10:01 crc kubenswrapper[4693]: I1125 12:10:01.447264 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/3.log" Nov 25 12:10:01 crc kubenswrapper[4693]: I1125 12:10:01.450282 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerStarted","Data":"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883"} Nov 25 12:10:01 crc kubenswrapper[4693]: I1125 12:10:01.450861 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:10:01 crc kubenswrapper[4693]: I1125 12:10:01.493677 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podStartSLOduration=96.493653722 podStartE2EDuration="1m36.493653722s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:01.489838789 +0000 UTC m=+121.407924180" watchObservedRunningTime="2025-11-25 12:10:01.493653722 +0000 UTC m=+121.411739143" Nov 25 12:10:01 crc kubenswrapper[4693]: I1125 12:10:01.812747 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:01 crc kubenswrapper[4693]: E1125 12:10:01.813471 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:01 crc kubenswrapper[4693]: I1125 12:10:01.914567 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n2f89"] Nov 25 12:10:02 crc kubenswrapper[4693]: I1125 12:10:02.454288 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:02 crc kubenswrapper[4693]: E1125 12:10:02.454816 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:02 crc kubenswrapper[4693]: I1125 12:10:02.812121 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:02 crc kubenswrapper[4693]: I1125 12:10:02.812191 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:02 crc kubenswrapper[4693]: I1125 12:10:02.812145 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:02 crc kubenswrapper[4693]: E1125 12:10:02.812294 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:02 crc kubenswrapper[4693]: E1125 12:10:02.812339 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:02 crc kubenswrapper[4693]: E1125 12:10:02.812429 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:03 crc kubenswrapper[4693]: I1125 12:10:03.812653 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:03 crc kubenswrapper[4693]: E1125 12:10:03.812865 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:04 crc kubenswrapper[4693]: I1125 12:10:04.812477 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:04 crc kubenswrapper[4693]: I1125 12:10:04.812549 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:04 crc kubenswrapper[4693]: I1125 12:10:04.812688 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:04 crc kubenswrapper[4693]: E1125 12:10:04.812720 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:04 crc kubenswrapper[4693]: E1125 12:10:04.812848 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:04 crc kubenswrapper[4693]: E1125 12:10:04.812942 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:05 crc kubenswrapper[4693]: I1125 12:10:05.812041 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:05 crc kubenswrapper[4693]: E1125 12:10:05.812325 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:05 crc kubenswrapper[4693]: E1125 12:10:05.932657 4693 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Nov 25 12:10:06 crc kubenswrapper[4693]: I1125 12:10:06.812264 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:06 crc kubenswrapper[4693]: I1125 12:10:06.812323 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:06 crc kubenswrapper[4693]: I1125 12:10:06.812264 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:06 crc kubenswrapper[4693]: E1125 12:10:06.812498 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:06 crc kubenswrapper[4693]: E1125 12:10:06.812597 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:06 crc kubenswrapper[4693]: E1125 12:10:06.812812 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:07 crc kubenswrapper[4693]: I1125 12:10:07.812588 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:07 crc kubenswrapper[4693]: E1125 12:10:07.812836 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:08 crc kubenswrapper[4693]: I1125 12:10:08.812581 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:08 crc kubenswrapper[4693]: I1125 12:10:08.812688 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:08 crc kubenswrapper[4693]: E1125 12:10:08.813104 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:08 crc kubenswrapper[4693]: E1125 12:10:08.813232 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:08 crc kubenswrapper[4693]: I1125 12:10:08.813738 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:08 crc kubenswrapper[4693]: E1125 12:10:08.813858 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:09 crc kubenswrapper[4693]: I1125 12:10:09.812690 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:09 crc kubenswrapper[4693]: E1125 12:10:09.812981 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:10 crc kubenswrapper[4693]: I1125 12:10:10.812501 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:10 crc kubenswrapper[4693]: E1125 12:10:10.813816 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:10 crc kubenswrapper[4693]: I1125 12:10:10.813966 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:10 crc kubenswrapper[4693]: I1125 12:10:10.813959 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:10 crc kubenswrapper[4693]: E1125 12:10:10.814154 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:10 crc kubenswrapper[4693]: E1125 12:10:10.814426 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:10 crc kubenswrapper[4693]: E1125 12:10:10.933112 4693 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 25 12:10:11 crc kubenswrapper[4693]: I1125 12:10:11.812014 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:11 crc kubenswrapper[4693]: E1125 12:10:11.812226 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:12 crc kubenswrapper[4693]: I1125 12:10:12.811857 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:12 crc kubenswrapper[4693]: I1125 12:10:12.811905 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:12 crc kubenswrapper[4693]: I1125 12:10:12.812028 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:12 crc kubenswrapper[4693]: E1125 12:10:12.812106 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:12 crc kubenswrapper[4693]: E1125 12:10:12.812347 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:12 crc kubenswrapper[4693]: I1125 12:10:12.812721 4693 scope.go:117] "RemoveContainer" containerID="382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486" Nov 25 12:10:12 crc kubenswrapper[4693]: E1125 12:10:12.812754 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:13 crc kubenswrapper[4693]: I1125 12:10:13.497730 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/1.log" Nov 25 12:10:13 crc kubenswrapper[4693]: I1125 12:10:13.498310 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerStarted","Data":"341e4691994213e289b6b6687d57d43f5c4a8981a11aa8daf3be474b270d87f7"} Nov 25 12:10:13 crc kubenswrapper[4693]: I1125 12:10:13.812626 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:13 crc kubenswrapper[4693]: E1125 12:10:13.812855 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:14 crc kubenswrapper[4693]: I1125 12:10:14.812323 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:14 crc kubenswrapper[4693]: I1125 12:10:14.812527 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:14 crc kubenswrapper[4693]: I1125 12:10:14.812779 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:14 crc kubenswrapper[4693]: E1125 12:10:14.812785 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 25 12:10:14 crc kubenswrapper[4693]: E1125 12:10:14.812910 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 25 12:10:14 crc kubenswrapper[4693]: E1125 12:10:14.813019 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 25 12:10:15 crc kubenswrapper[4693]: I1125 12:10:15.811925 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:15 crc kubenswrapper[4693]: E1125 12:10:15.812148 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2f89" podUID="a10eb19c-b500-4cf9-961d-1892ba67560a" Nov 25 12:10:16 crc kubenswrapper[4693]: I1125 12:10:16.814661 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:16 crc kubenswrapper[4693]: I1125 12:10:16.814735 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:16 crc kubenswrapper[4693]: I1125 12:10:16.814753 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:16 crc kubenswrapper[4693]: I1125 12:10:16.822195 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 25 12:10:16 crc kubenswrapper[4693]: I1125 12:10:16.822469 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 12:10:16 crc kubenswrapper[4693]: I1125 12:10:16.822509 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 25 12:10:16 crc kubenswrapper[4693]: I1125 12:10:16.822726 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 25 12:10:17 crc kubenswrapper[4693]: I1125 12:10:17.812420 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:17 crc kubenswrapper[4693]: I1125 12:10:17.817029 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 25 12:10:17 crc kubenswrapper[4693]: I1125 12:10:17.817120 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 25 12:10:18 crc kubenswrapper[4693]: I1125 12:10:18.355647 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.104517 4693 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.157825 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mkt5h"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.158810 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.161169 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9ljwp"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.161894 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.162871 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c4h8g"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.163191 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.163473 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.163757 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.164175 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.164228 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.164331 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.164504 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.164646 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.166531 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.167301 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.168048 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.168785 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.168855 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.169241 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.170766 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.171486 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ptflj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.172166 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.172629 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.172647 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.172773 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.172729 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.174682 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.180123 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.180334 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.180890 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.189174 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.202268 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-skxbw"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.202438 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.202869 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-52nbn"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203041 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203101 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.202860 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203358 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.202983 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203020 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203568 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203191 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203754 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203238 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203861 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.203973 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.204083 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.204168 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.204169 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.204809 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.204828 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205053 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205180 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205298 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205394 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205399 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205507 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205646 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205679 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205296 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205789 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205904 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.205969 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.206053 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.206116 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.206249 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.207583 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.224906 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.225584 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zm7pc"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.225932 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4b2tf"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.226323 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.226459 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.226490 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.226498 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.226805 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.227028 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.227155 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.228078 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.228105 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.228692 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xtxw4"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.229123 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.229639 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.229789 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.229956 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.229990 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.230200 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.230269 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.230211 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.230439 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.230636 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.230650 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.231111 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mkt5h"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.233587 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ptflj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.233655 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.236645 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7snhp"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.237049 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.237346 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.237923 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.238041 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.241925 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.242566 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.242704 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.243148 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.243345 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.243579 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.243799 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.243930 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.244340 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.244790 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.244870 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.244975 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.245026 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.245042 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.245111 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.257395 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.263309 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9ljwp"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.263730 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.267125 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.267853 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.285967 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.286257 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.287252 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.287707 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.287912 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.288068 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.288320 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.288410 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.288501 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.288742 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.289241 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.289579 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.290569 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.290657 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.290699 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.294533 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.294721 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.295133 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.295231 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.296212 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.297218 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.297689 4693 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.297878 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.298978 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.299133 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.299286 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p8tpr"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.299742 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.300196 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.300618 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.301180 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.302763 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.309367 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q8bsw"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.309855 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.310272 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b7p2s"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.310983 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.311201 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.312328 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.313168 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.313484 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.313856 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.314258 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.323034 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.323196 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c4h8g"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.323243 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4b2tf"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.327651 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.340194 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xtxw4"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.341696 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg5pd"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.342754 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.342769 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.342916 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.356559 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2ggsx"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.357205 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.357640 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358011 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-config\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358081 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw26s\" (UniqueName: \"kubernetes.io/projected/ed425447-604d-40a0-969a-97645b617956-kube-api-access-xw26s\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358103 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/872e0bc7-2c6e-43cb-98ca-18b85200e276-config\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358120 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjw6c\" (UniqueName: \"kubernetes.io/projected/75594753-ed30-49c9-b2ee-b63e64782ab3-kube-api-access-wjw6c\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358135 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-image-import-ca\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358152 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358167 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-policies\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358185 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7919df1-557b-4835-9a9b-680eac28f2c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358202 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56882945-e452-4371-bd3b-9fb0e33a5de0-config\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358217 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-encryption-config\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358234 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358253 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358269 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-config\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358284 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-serving-cert\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358298 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/da1a8ea0-27fe-434b-9d60-641c1645b75b-audit-dir\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358311 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/872e0bc7-2c6e-43cb-98ca-18b85200e276-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358329 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7919df1-557b-4835-9a9b-680eac28f2c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358345 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-service-ca-bundle\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358361 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s66fd\" (UniqueName: \"kubernetes.io/projected/56882945-e452-4371-bd3b-9fb0e33a5de0-kube-api-access-s66fd\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358390 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a8467-81c7-4195-9ead-f0cbdba07d61-config\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358407 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q9g\" (UniqueName: \"kubernetes.io/projected/549a8467-81c7-4195-9ead-f0cbdba07d61-kube-api-access-s9q9g\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358423 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-client-ca\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358440 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358456 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-etcd-client\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358473 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-etcd-serving-ca\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358492 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18eaddfe-4ea4-4581-afe0-b778eb74ff49-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wj9xr\" (UID: \"18eaddfe-4ea4-4581-afe0-b778eb74ff49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358509 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmm5\" (UniqueName: \"kubernetes.io/projected/9f4fda99-711f-4947-aae3-55186580a3cc-kube-api-access-mdmm5\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358526 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-audit\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358544 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-encryption-config\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358563 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358580 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d07a0f64-0b4d-4719-9a5f-574120ad186a-serving-cert\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358598 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56882945-e452-4371-bd3b-9fb0e33a5de0-auth-proxy-config\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358614 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2z9c\" (UniqueName: \"kubernetes.io/projected/413025b6-a706-4ad3-b920-2c9929ddaa0e-kube-api-access-x2z9c\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358633 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358653 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75594753-ed30-49c9-b2ee-b63e64782ab3-audit-dir\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358673 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a21ef08-fa38-470a-a821-3f39a5d72f23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358693 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed425447-604d-40a0-969a-97645b617956-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358714 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/64d12306-ed10-4a16-8c2b-941bfafaa705-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358736 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358756 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed425447-604d-40a0-969a-97645b617956-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358775 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcdsr\" (UniqueName: \"kubernetes.io/projected/725c1b7d-81c5-4bbe-99b1-c53b93754feb-kube-api-access-jcdsr\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358791 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a21ef08-fa38-470a-a821-3f39a5d72f23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358809 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e45bad22-7711-44b5-a425-cdc54a795feb-metrics-tls\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358825 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6v5r\" (UniqueName: \"kubernetes.io/projected/1579bf21-c619-4e37-b0f4-f7b9727daaf5-kube-api-access-q6v5r\") pod \"dns-operator-744455d44c-7snhp\" (UID: \"1579bf21-c619-4e37-b0f4-f7b9727daaf5\") " pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358841 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-etcd-client\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358859 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-config\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358877 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bccd7dbe-e658-4ce4-be99-b6642a5bb498-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358892 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2st5n\" (UniqueName: \"kubernetes.io/projected/a9e1e257-9a52-475a-a5ec-cd6fa9449f24-kube-api-access-2st5n\") pod \"downloads-7954f5f757-skxbw\" (UID: \"a9e1e257-9a52-475a-a5ec-cd6fa9449f24\") " pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358917 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-audit-policies\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358934 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkkk5\" (UniqueName: \"kubernetes.io/projected/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-kube-api-access-dkkk5\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358949 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-trusted-ca-bundle\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358965 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a21ef08-fa38-470a-a821-3f39a5d72f23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.358979 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-dir\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359043 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359059 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bccd7dbe-e658-4ce4-be99-b6642a5bb498-images\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359076 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8skz\" (UniqueName: \"kubernetes.io/projected/e45bad22-7711-44b5-a425-cdc54a795feb-kube-api-access-x8skz\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359111 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-config\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359316 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-oauth-serving-cert\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359352 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmr8k\" (UniqueName: \"kubernetes.io/projected/18eaddfe-4ea4-4581-afe0-b778eb74ff49-kube-api-access-zmr8k\") pod \"cluster-samples-operator-665b6dd947-wj9xr\" (UID: \"18eaddfe-4ea4-4581-afe0-b778eb74ff49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359396 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359416 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jz8k\" (UniqueName: \"kubernetes.io/projected/bccd7dbe-e658-4ce4-be99-b6642a5bb498-kube-api-access-9jz8k\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359432 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359491 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45bad22-7711-44b5-a425-cdc54a795feb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359512 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359540 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-serving-cert\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359588 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413025b6-a706-4ad3-b920-2c9929ddaa0e-serving-cert\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359680 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhv29\" (UniqueName: \"kubernetes.io/projected/da1a8ea0-27fe-434b-9d60-641c1645b75b-kube-api-access-vhv29\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359718 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7919df1-557b-4835-9a9b-680eac28f2c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359757 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/872e0bc7-2c6e-43cb-98ca-18b85200e276-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359778 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359805 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xtdr\" (UniqueName: \"kubernetes.io/projected/d07a0f64-0b4d-4719-9a5f-574120ad186a-kube-api-access-6xtdr\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359849 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45bad22-7711-44b5-a425-cdc54a795feb-trusted-ca\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359882 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-serving-cert\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359901 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359933 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359956 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-oauth-config\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.359976 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-service-ca\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360036 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a8467-81c7-4195-9ead-f0cbdba07d61-serving-cert\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360057 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/c3bdbfb7-27fc-41d4-a157-36363c246c38-kube-api-access-8wv4r\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360082 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64d12306-ed10-4a16-8c2b-941bfafaa705-serving-cert\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360102 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360138 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4fda99-711f-4947-aae3-55186580a3cc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360191 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56882945-e452-4371-bd3b-9fb0e33a5de0-machine-approver-tls\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360209 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-client-ca\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360229 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-config\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360248 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-serving-cert\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360268 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hlpv6\" (UniqueName: \"kubernetes.io/projected/2a21ef08-fa38-470a-a821-3f39a5d72f23-kube-api-access-hlpv6\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360338 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f4fda99-711f-4947-aae3-55186580a3cc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360361 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/549a8467-81c7-4195-9ead-f0cbdba07d61-trusted-ca\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360391 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75594753-ed30-49c9-b2ee-b63e64782ab3-node-pullsecrets\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360416 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkn5x\" (UniqueName: \"kubernetes.io/projected/64d12306-ed10-4a16-8c2b-941bfafaa705-kube-api-access-vkn5x\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360431 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bccd7dbe-e658-4ce4-be99-b6642a5bb498-config\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360455 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1579bf21-c619-4e37-b0f4-f7b9727daaf5-metrics-tls\") pod \"dns-operator-744455d44c-7snhp\" (UID: \"1579bf21-c619-4e37-b0f4-f7b9727daaf5\") " pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.360850 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.361015 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.361711 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.362605 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8r9bv"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.363538 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.363626 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.364072 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.367311 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.368131 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.368826 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.369484 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.370092 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.370276 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.373344 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gstv2"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.374045 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.374307 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.375059 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.377989 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-n884p"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.378579 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.379239 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.379829 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.380395 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.380920 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.381888 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.382766 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7snhp"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.384577 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.387005 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.387844 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.390641 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.392079 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zm7pc"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.393130 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p8tpr"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.394821 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.395758 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-52nbn"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.397153 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.398363 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.399637 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.400811 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.401451 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.402330 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-skxbw"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.403660 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.405551 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b7p2s"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.406283 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.407430 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg5pd"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.408514 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.411115 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gstv2"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.412194 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q8bsw"] Nov 25 
12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.412751 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.414545 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.416903 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ggsx"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.419464 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.420045 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.421150 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.421730 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.422494 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.423569 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jtvmf"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.429756 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.430417 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nckmj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.439040 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.439169 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.440046 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jtvmf"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.441179 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nckmj"] Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.441547 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461045 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461336 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45bad22-7711-44b5-a425-cdc54a795feb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461444 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461520 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pff5h\" (UniqueName: \"kubernetes.io/projected/e5113876-cf94-45b5-9edc-e4ac8af59cb9-kube-api-access-pff5h\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-serving-cert\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461688 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/51398f58-1dab-4bf2-a7dc-b8669a515200-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6fkp\" (UID: \"51398f58-1dab-4bf2-a7dc-b8669a515200\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461761 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699627e5-fe61-4db8-885c-07ef0e8fb8fc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:20 crc kubenswrapper[4693]: 
I1125 12:10:20.461833 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m42r\" (UniqueName: \"kubernetes.io/projected/b473ce6c-f37a-472a-a1f2-89332034cdee-kube-api-access-4m42r\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461921 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7919df1-557b-4835-9a9b-680eac28f2c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.461991 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-serving-cert\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462080 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a1d076-b5f5-41c5-89d5-a6975b170f07-serving-cert\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462226 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45bad22-7711-44b5-a425-cdc54a795feb-trusted-ca\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462311 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462447 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-service-ca\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462530 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/c3bdbfb7-27fc-41d4-a157-36363c246c38-kube-api-access-8wv4r\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462606 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64d12306-ed10-4a16-8c2b-941bfafaa705-serving-cert\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462676 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p25dg\" (UniqueName: \"kubernetes.io/projected/d1a1d076-b5f5-41c5-89d5-a6975b170f07-kube-api-access-p25dg\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462818 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462903 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-client-ca\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.462978 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-config\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.463047 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-serving-cert\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.463122 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/549a8467-81c7-4195-9ead-f0cbdba07d61-trusted-ca\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.463191 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75594753-ed30-49c9-b2ee-b63e64782ab3-node-pullsecrets\") pod \"apiserver-76f77b778f-c4h8g\" (UID: 
\"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.463259 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-apiservice-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.463327 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-service-ca\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.463637 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/75594753-ed30-49c9-b2ee-b63e64782ab3-node-pullsecrets\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.464247 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-client-ca\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.464310 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-config\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.463330 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f4fda99-711f-4947-aae3-55186580a3cc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.464706 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-profile-collector-cert\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.464817 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkn5x\" (UniqueName: \"kubernetes.io/projected/64d12306-ed10-4a16-8c2b-941bfafaa705-kube-api-access-vkn5x\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.465249 4693 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bccd7dbe-e658-4ce4-be99-b6642a5bb498-config\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.465984 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjqt\" (UniqueName: \"kubernetes.io/projected/d05ddc98-953b-4b53-8027-ba54f58fdf70-kube-api-access-ggjqt\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.465950 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bccd7dbe-e658-4ce4-be99-b6642a5bb498-config\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.465783 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/549a8467-81c7-4195-9ead-f0cbdba07d61-trusted-ca\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466075 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1579bf21-c619-4e37-b0f4-f7b9727daaf5-metrics-tls\") pod \"dns-operator-744455d44c-7snhp\" (UID: \"1579bf21-c619-4e37-b0f4-f7b9727daaf5\") " pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466162 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-config\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466215 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw26s\" (UniqueName: \"kubernetes.io/projected/ed425447-604d-40a0-969a-97645b617956-kube-api-access-xw26s\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466246 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/872e0bc7-2c6e-43cb-98ca-18b85200e276-config\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466275 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjw6c\" (UniqueName: \"kubernetes.io/projected/75594753-ed30-49c9-b2ee-b63e64782ab3-kube-api-access-wjw6c\") pod \"apiserver-76f77b778f-c4h8g\" (UID: 
\"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466314 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-signing-key\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466413 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-policies\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466441 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjm8w\" (UniqueName: \"kubernetes.io/projected/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-kube-api-access-pjm8w\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466470 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466502 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56882945-e452-4371-bd3b-9fb0e33a5de0-config\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466527 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-config\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466556 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-service-ca\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466581 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-service-ca-bundle\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466611 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/872e0bc7-2c6e-43cb-98ca-18b85200e276-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466637 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-default-certificate\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466661 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-stats-auth\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466687 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-serving-cert\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466716 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da1a8ea0-27fe-434b-9d60-641c1645b75b-audit-dir\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466748 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-client-ca\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466787 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466816 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7919df1-557b-4835-9a9b-680eac28f2c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466846 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-etcd-client\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466861 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466871 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-etcd-serving-ca\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466900 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-srv-cert\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467017 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s66fd\" (UniqueName: \"kubernetes.io/projected/56882945-e452-4371-bd3b-9fb0e33a5de0-kube-api-access-s66fd\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467064 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a8467-81c7-4195-9ead-f0cbdba07d61-config\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467094 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18eaddfe-4ea4-4581-afe0-b778eb74ff49-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wj9xr\" (UID: \"18eaddfe-4ea4-4581-afe0-b778eb74ff49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467122 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8thdg\" (UniqueName: \"kubernetes.io/projected/d66c42e9-0aba-45bd-867f-6f905804b854-kube-api-access-8thdg\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467155 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmm5\" (UniqueName: \"kubernetes.io/projected/9f4fda99-711f-4947-aae3-55186580a3cc-kube-api-access-mdmm5\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: 
\"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467194 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-encryption-config\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467221 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-images\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467247 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56882945-e452-4371-bd3b-9fb0e33a5de0-auth-proxy-config\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467273 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75594753-ed30-49c9-b2ee-b63e64782ab3-audit-dir\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467513 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-policies\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467536 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed425447-604d-40a0-969a-97645b617956-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.467633 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-config\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.466918 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/da1a8ea0-27fe-434b-9d60-641c1645b75b-audit-dir\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468053 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/64d12306-ed10-4a16-8c2b-941bfafaa705-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468087 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdsr\" (UniqueName: \"kubernetes.io/projected/725c1b7d-81c5-4bbe-99b1-c53b93754feb-kube-api-access-jcdsr\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468111 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a21ef08-fa38-470a-a821-3f39a5d72f23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468134 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468161 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-srv-cert\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468180 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-proxy-tls\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468200 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468222 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-config\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468241 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bccd7dbe-e658-4ce4-be99-b6642a5bb498-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468295 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f4fda99-711f-4947-aae3-55186580a3cc-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468469 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c77bff0-0391-4a39-890a-8a81b2924f91-proxy-tls\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gprp7\" (UniqueName: \"kubernetes.io/projected/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-kube-api-access-gprp7\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468494 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468532 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-etcd-client\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468558 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-trusted-ca-bundle\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468573 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/75594753-ed30-49c9-b2ee-b63e64782ab3-audit-dir\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468583 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-audit-policies\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468637 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkkk5\" (UniqueName: \"kubernetes.io/projected/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-kube-api-access-dkkk5\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468673 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468714 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66c42e9-0aba-45bd-867f-6f905804b854-serving-cert\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468747 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-dir\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468771 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8skz\" (UniqueName: \"kubernetes.io/projected/e45bad22-7711-44b5-a425-cdc54a795feb-kube-api-access-x8skz\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468796 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-metrics-certs\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.468840 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-config\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.469106 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64d12306-ed10-4a16-8c2b-941bfafaa705-serving-cert\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.469252 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-audit-policies\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.469283 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-serving-cert\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.469824 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56882945-e452-4371-bd3b-9fb0e33a5de0-auth-proxy-config\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.469951 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.470033 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56882945-e452-4371-bd3b-9fb0e33a5de0-config\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.470283 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/549a8467-81c7-4195-9ead-f0cbdba07d61-config\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.470642 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/64d12306-ed10-4a16-8c2b-941bfafaa705-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471497 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-dir\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471677 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-config\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471724 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18eaddfe-4ea4-4581-afe0-b778eb74ff49-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wj9xr\" (UID: \"18eaddfe-4ea4-4581-afe0-b778eb74ff49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471785 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-signing-cabundle\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471835 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-oauth-serving-cert\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471863 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmr8k\" (UniqueName: \"kubernetes.io/projected/18eaddfe-4ea4-4581-afe0-b778eb74ff49-kube-api-access-zmr8k\") pod \"cluster-samples-operator-665b6dd947-wj9xr\" (UID: \"18eaddfe-4ea4-4581-afe0-b778eb74ff49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471928 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jz8k\" (UniqueName: \"kubernetes.io/projected/bccd7dbe-e658-4ce4-be99-b6642a5bb498-kube-api-access-9jz8k\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471956 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b4h6\" (UniqueName: \"kubernetes.io/projected/51398f58-1dab-4bf2-a7dc-b8669a515200-kube-api-access-6b4h6\") pod \"package-server-manager-789f6589d5-h6fkp\" (UID: \"51398f58-1dab-4bf2-a7dc-b8669a515200\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.471986 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472048 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 
12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472080 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699627e5-fe61-4db8-885c-07ef0e8fb8fc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472093 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-etcd-serving-ca\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472107 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/559e62da-b780-4e38-95ee-379cc6066ffa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472142 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413025b6-a706-4ad3-b920-2c9929ddaa0e-serving-cert\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472170 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhv29\" (UniqueName: \"kubernetes.io/projected/da1a8ea0-27fe-434b-9d60-641c1645b75b-kube-api-access-vhv29\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472195 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/872e0bc7-2c6e-43cb-98ca-18b85200e276-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472219 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472247 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xtdr\" (UniqueName: \"kubernetes.io/projected/d07a0f64-0b4d-4719-9a5f-574120ad186a-kube-api-access-6xtdr\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: 
I1125 12:10:20.472333 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8nzl\" (UniqueName: \"kubernetes.io/projected/667a5672-0a53-472e-a366-1b66dbbc2189-kube-api-access-f8nzl\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472361 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gstv2\" (UID: \"5c3a5def-8ee1-4719-ab67-916e2cddc6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472402 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwx97\" (UniqueName: \"kubernetes.io/projected/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-kube-api-access-lwx97\") pod \"multus-admission-controller-857f4d67dd-gstv2\" (UID: \"5c3a5def-8ee1-4719-ab67-916e2cddc6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472440 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-oauth-config\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472468 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a8467-81c7-4195-9ead-f0cbdba07d61-serving-cert\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472498 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-node-bootstrap-token\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472526 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472548 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-config\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472952 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.473044 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7919df1-557b-4835-9a9b-680eac28f2c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.472553 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/438f718d-ef67-42c1-b624-7d69c5e6b13f-secret-volume\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.473432 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4fda99-711f-4947-aae3-55186580a3cc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.470637 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-serving-cert\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.473585 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.474119 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56882945-e452-4371-bd3b-9fb0e33a5de0-machine-approver-tls\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.474177 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.474209 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfw6p\" (UniqueName: \"kubernetes.io/projected/12e72fe6-ed8c-4f53-a8ee-47e19a656342-kube-api-access-vfw6p\") pod 
\"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.474412 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-serving-cert\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.474906 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-etcd-client\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475200 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-etcd-client\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475293 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475294 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bccd7dbe-e658-4ce4-be99-b6642a5bb498-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475481 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlpv6\" (UniqueName: \"kubernetes.io/projected/2a21ef08-fa38-470a-a821-3f39a5d72f23-kube-api-access-hlpv6\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475508 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/75594753-ed30-49c9-b2ee-b63e64782ab3-encryption-config\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475558 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a1d076-b5f5-41c5-89d5-a6975b170f07-config\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475563 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed425447-604d-40a0-969a-97645b617956-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475660 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-oauth-serving-cert\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475675 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475732 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8slc7\" (UniqueName: \"kubernetes.io/projected/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-kube-api-access-8slc7\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.475837 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476026 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-oauth-config\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476034 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476101 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/559e62da-b780-4e38-95ee-379cc6066ffa-config\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476183 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9f4fda99-711f-4947-aae3-55186580a3cc-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476181 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7919df1-557b-4835-9a9b-680eac28f2c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476471 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476621 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-image-import-ca\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476876 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-config\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476972 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477120 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-encryption-config\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477200 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477288 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477361 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm8k7\" (UniqueName: \"kubernetes.io/projected/1c77bff0-0391-4a39-890a-8a81b2924f91-kube-api-access-hm8k7\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477667 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-service-ca-bundle\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477799 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plxqt\" (UniqueName: \"kubernetes.io/projected/699627e5-fe61-4db8-885c-07ef0e8fb8fc-kube-api-access-plxqt\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477922 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9q9g\" (UniqueName: \"kubernetes.io/projected/549a8467-81c7-4195-9ead-f0cbdba07d61-kube-api-access-s9q9g\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477217 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/549a8467-81c7-4195-9ead-f0cbdba07d61-serving-cert\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476744 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413025b6-a706-4ad3-b920-2c9929ddaa0e-serving-cert\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.477675 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da1a8ea0-27fe-434b-9d60-641c1645b75b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.476839 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 
12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.478438 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-image-import-ca\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.478439 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.478441 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-service-ca-bundle\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.478676 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-certs\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.478784 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-audit\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479461 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2z9c\" (UniqueName: \"kubernetes.io/projected/413025b6-a706-4ad3-b920-2c9929ddaa0e-kube-api-access-x2z9c\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479409 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-audit\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479169 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479569 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479679 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479717 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07a0f64-0b4d-4719-9a5f-574120ad186a-serving-cert\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479748 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/559e62da-b780-4e38-95ee-379cc6066ffa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479787 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a21ef08-fa38-470a-a821-3f39a5d72f23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479821 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed425447-604d-40a0-969a-97645b617956-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479851 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6v5r\" (UniqueName: \"kubernetes.io/projected/1579bf21-c619-4e37-b0f4-f7b9727daaf5-kube-api-access-q6v5r\") pod \"dns-operator-744455d44c-7snhp\" (UID: \"1579bf21-c619-4e37-b0f4-f7b9727daaf5\") " pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479879 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tn76\" (UniqueName: \"kubernetes.io/projected/438f718d-ef67-42c1-b624-7d69c5e6b13f-kube-api-access-2tn76\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479909 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-ca\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479934 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e45bad22-7711-44b5-a425-cdc54a795feb-metrics-tls\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479964 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2st5n\" (UniqueName: \"kubernetes.io/projected/a9e1e257-9a52-475a-a5ec-cd6fa9449f24-kube-api-access-2st5n\") pod \"downloads-7954f5f757-skxbw\" (UID: \"a9e1e257-9a52-475a-a5ec-cd6fa9449f24\") " pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.479989 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e5113876-cf94-45b5-9edc-e4ac8af59cb9-tmpfs\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480013 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-webhook-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480059 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a21ef08-fa38-470a-a821-3f39a5d72f23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480077 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480090 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bccd7dbe-e658-4ce4-be99-b6642a5bb498-images\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480547 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm2gm\" (UniqueName: \"kubernetes.io/projected/9c2633b3-7860-4862-85de-77bcc6732a6c-kube-api-access-dm2gm\") pod \"migrator-59844c95c7-mpmjj\" (UID: \"9c2633b3-7860-4862-85de-77bcc6732a6c\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480597 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-client\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480637 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.480783 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-serving-cert\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.481044 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/56882945-e452-4371-bd3b-9fb0e33a5de0-machine-approver-tls\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.481406 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1579bf21-c619-4e37-b0f4-f7b9727daaf5-metrics-tls\") pod \"dns-operator-744455d44c-7snhp\" (UID: \"1579bf21-c619-4e37-b0f4-f7b9727daaf5\") " pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.481408 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.481473 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed425447-604d-40a0-969a-97645b617956-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.482145 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a21ef08-fa38-470a-a821-3f39a5d72f23-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.482209 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.482522 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07a0f64-0b4d-4719-9a5f-574120ad186a-config\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.483116 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-trusted-ca-bundle\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.483611 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a21ef08-fa38-470a-a821-3f39a5d72f23-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.483854 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bccd7dbe-e658-4ce4-be99-b6642a5bb498-images\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.485844 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-client-ca\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.486562 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e45bad22-7711-44b5-a425-cdc54a795feb-metrics-tls\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.491736 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/da1a8ea0-27fe-434b-9d60-641c1645b75b-encryption-config\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.491903 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d07a0f64-0b4d-4719-9a5f-574120ad186a-serving-cert\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.491912 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.496629 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75594753-ed30-49c9-b2ee-b63e64782ab3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.501659 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.507784 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7919df1-557b-4835-9a9b-680eac28f2c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.530592 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.534263 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e45bad22-7711-44b5-a425-cdc54a795feb-trusted-ca\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.540626 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.560482 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.580796 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581441 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-signing-key\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581530 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjm8w\" (UniqueName: \"kubernetes.io/projected/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-kube-api-access-pjm8w\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581553 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581571 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-service-ca\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581588 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-service-ca-bundle\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581616 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-default-certificate\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581639 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-stats-auth\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581662 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-srv-cert\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581697 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8thdg\" (UniqueName: \"kubernetes.io/projected/d66c42e9-0aba-45bd-867f-6f905804b854-kube-api-access-8thdg\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581733 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-images\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581748 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-srv-cert\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:20 crc 
kubenswrapper[4693]: I1125 12:10:20.581764 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-proxy-tls\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581794 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581814 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gprp7\" (UniqueName: \"kubernetes.io/projected/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-kube-api-access-gprp7\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581838 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c77bff0-0391-4a39-890a-8a81b2924f91-proxy-tls\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581878 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66c42e9-0aba-45bd-867f-6f905804b854-serving-cert\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581913 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-metrics-certs\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.581967 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-signing-cabundle\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582004 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b4h6\" (UniqueName: \"kubernetes.io/projected/51398f58-1dab-4bf2-a7dc-b8669a515200-kube-api-access-6b4h6\") pod \"package-server-manager-789f6589d5-h6fkp\" (UID: \"51398f58-1dab-4bf2-a7dc-b8669a515200\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/699627e5-fe61-4db8-885c-07ef0e8fb8fc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582051 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/559e62da-b780-4e38-95ee-379cc6066ffa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582091 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gstv2\" (UID: \"5c3a5def-8ee1-4719-ab67-916e2cddc6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582115 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwx97\" (UniqueName: \"kubernetes.io/projected/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-kube-api-access-lwx97\") pod \"multus-admission-controller-857f4d67dd-gstv2\" (UID: \"5c3a5def-8ee1-4719-ab67-916e2cddc6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582141 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8nzl\" (UniqueName: \"kubernetes.io/projected/667a5672-0a53-472e-a366-1b66dbbc2189-kube-api-access-f8nzl\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582160 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-node-bootstrap-token\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582178 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582197 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/438f718d-ef67-42c1-b624-7d69c5e6b13f-secret-volume\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582214 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfw6p\" (UniqueName: 
\"kubernetes.io/projected/12e72fe6-ed8c-4f53-a8ee-47e19a656342-kube-api-access-vfw6p\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582231 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a1d076-b5f5-41c5-89d5-a6975b170f07-config\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582256 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582272 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8slc7\" (UniqueName: \"kubernetes.io/projected/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-kube-api-access-8slc7\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582300 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/559e62da-b780-4e38-95ee-379cc6066ffa-config\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582319 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-config\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582336 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm8k7\" (UniqueName: \"kubernetes.io/projected/1c77bff0-0391-4a39-890a-8a81b2924f91-kube-api-access-hm8k7\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582362 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plxqt\" (UniqueName: \"kubernetes.io/projected/699627e5-fe61-4db8-885c-07ef0e8fb8fc-kube-api-access-plxqt\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582401 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-certs\") pod 
\"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582419 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/559e62da-b780-4e38-95ee-379cc6066ffa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582450 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tn76\" (UniqueName: \"kubernetes.io/projected/438f718d-ef67-42c1-b624-7d69c5e6b13f-kube-api-access-2tn76\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582487 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-ca\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582534 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e5113876-cf94-45b5-9edc-e4ac8af59cb9-tmpfs\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582576 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-webhook-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582604 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582629 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm2gm\" (UniqueName: \"kubernetes.io/projected/9c2633b3-7860-4862-85de-77bcc6732a6c-kube-api-access-dm2gm\") pod \"migrator-59844c95c7-mpmjj\" (UID: \"9c2633b3-7860-4862-85de-77bcc6732a6c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582650 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-client\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 
12:10:20.582686 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pff5h\" (UniqueName: \"kubernetes.io/projected/e5113876-cf94-45b5-9edc-e4ac8af59cb9-kube-api-access-pff5h\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582709 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699627e5-fe61-4db8-885c-07ef0e8fb8fc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582733 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/51398f58-1dab-4bf2-a7dc-b8669a515200-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6fkp\" (UID: \"51398f58-1dab-4bf2-a7dc-b8669a515200\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582774 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m42r\" (UniqueName: \"kubernetes.io/projected/b473ce6c-f37a-472a-a1f2-89332034cdee-kube-api-access-4m42r\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582815 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a1d076-b5f5-41c5-89d5-a6975b170f07-serving-cert\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582874 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p25dg\" (UniqueName: \"kubernetes.io/projected/d1a1d076-b5f5-41c5-89d5-a6975b170f07-kube-api-access-p25dg\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582902 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582926 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-apiservice-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582960 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-profile-collector-cert\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.582990 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjqt\" (UniqueName: \"kubernetes.io/projected/d05ddc98-953b-4b53-8027-ba54f58fdf70-kube-api-access-ggjqt\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.583525 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e5113876-cf94-45b5-9edc-e4ac8af59cb9-tmpfs\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.583573 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-auth-proxy-config\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.585928 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.597157 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/872e0bc7-2c6e-43cb-98ca-18b85200e276-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.600871 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.622464 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.627692 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/872e0bc7-2c6e-43cb-98ca-18b85200e276-config\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.641346 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.661269 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.680774 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.700431 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.742082 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.745598 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-config\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.762266 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.780636 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.802442 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.817719 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66c42e9-0aba-45bd-867f-6f905804b854-serving-cert\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.821276 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.826700 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-client\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.841762 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.844664 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-ca\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.860818 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 12:10:20 crc 
kubenswrapper[4693]: I1125 12:10:20.863611 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d66c42e9-0aba-45bd-867f-6f905804b854-etcd-service-ca\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.881926 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.902539 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.920924 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.927124 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/51398f58-1dab-4bf2-a7dc-b8669a515200-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-h6fkp\" (UID: \"51398f58-1dab-4bf2-a7dc-b8669a515200\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.940877 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.961276 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 25 12:10:20 crc kubenswrapper[4693]: I1125 12:10:20.981170 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.001683 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.036860 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.037046 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.045336 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.055922 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/559e62da-b780-4e38-95ee-379cc6066ffa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.062726 4693 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.081540 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.083215 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-signing-cabundle\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.100973 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.122006 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.125780 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-signing-key\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.140799 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.160831 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.181541 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.208096 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.214687 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.222137 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.241272 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.243721 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/559e62da-b780-4e38-95ee-379cc6066ffa-config\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.261819 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 12:10:21 crc 
kubenswrapper[4693]: I1125 12:10:21.281969 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.302432 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.320898 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.341970 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.360038 4693 request.go:700] Waited for 1.002503811s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.362586 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.381585 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.401949 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.420958 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.442113 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.447107 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-stats-auth\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.461797 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.483113 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.496061 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-default-certificate\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.501702 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.522789 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.537589 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-metrics-certs\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.541619 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.561558 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.563447 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-service-ca-bundle\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.581807 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582465 4693 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582497 4693 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582557 4693 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582566 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c77bff0-0391-4a39-890a-8a81b2924f91-proxy-tls podName:1c77bff0-0391-4a39-890a-8a81b2924f91 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.082539817 +0000 UTC m=+142.000625218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1c77bff0-0391-4a39-890a-8a81b2924f91-proxy-tls") pod "machine-config-operator-74547568cd-t4r4s" (UID: "1c77bff0-0391-4a39-890a-8a81b2924f91") : failed to sync secret cache: timed out waiting for the condition Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582656 4693 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582704 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/699627e5-fe61-4db8-885c-07ef0e8fb8fc-config podName:699627e5-fe61-4db8-885c-07ef0e8fb8fc nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.08266776 +0000 UTC m=+142.000753181 (durationBeforeRetry 500ms). 
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582704 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/699627e5-fe61-4db8-885c-07ef0e8fb8fc-config podName:699627e5-fe61-4db8-885c-07ef0e8fb8fc nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.08266776 +0000 UTC m=+142.000753181 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/699627e5-fe61-4db8-885c-07ef0e8fb8fc-config") pod "kube-storage-version-migrator-operator-b67b599dd-hnh5z" (UID: "699627e5-fe61-4db8-885c-07ef0e8fb8fc") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582740 4693 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582754 4693 secret.go:188] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582809 4693 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582849 4693 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.582905 4693 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583009 4693 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583055 4693 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583069 4693 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583126 4693 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583503 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-srv-cert podName:667a5672-0a53-472e-a366-1b66dbbc2189 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.082718062 +0000 UTC m=+142.000803483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-srv-cert") pod "olm-operator-6b444d44fb-7qd8t" (UID: "667a5672-0a53-472e-a366-1b66dbbc2189") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583561 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1a1d076-b5f5-41c5-89d5-a6975b170f07-config podName:d1a1d076-b5f5-41c5-89d5-a6975b170f07 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083541768 +0000 UTC m=+142.001627189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d1a1d076-b5f5-41c5-89d5-a6975b170f07-config") pod "service-ca-operator-777779d784-8jmkl" (UID: "d1a1d076-b5f5-41c5-89d5-a6975b170f07") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583588 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-srv-cert podName:12e72fe6-ed8c-4f53-a8ee-47e19a656342 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083574889 +0000 UTC m=+142.001660310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-srv-cert") pod "catalog-operator-68c6474976-mk5xd" (UID: "12e72fe6-ed8c-4f53-a8ee-47e19a656342") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583611 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-proxy-tls podName:83eaa7e8-5eda-480d-bf44-9bd329c12e8d nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.0835997 +0000 UTC m=+142.001685121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-proxy-tls") pod "machine-config-controller-84d6567774-qwmhz" (UID: "83eaa7e8-5eda-480d-bf44-9bd329c12e8d") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583638 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume podName:438f718d-ef67-42c1-b624-7d69c5e6b13f nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083627681 +0000 UTC m=+142.001713102 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume") pod "collect-profiles-29401200-p77sg" (UID: "438f718d-ef67-42c1-b624-7d69c5e6b13f") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583663 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-images podName:1c77bff0-0391-4a39-890a-8a81b2924f91 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083651701 +0000 UTC m=+142.001737122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-images") pod "machine-config-operator-74547568cd-t4r4s" (UID: "1c77bff0-0391-4a39-890a-8a81b2924f91") : failed to sync configmap cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583688 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-webhook-certs podName:5c3a5def-8ee1-4719-ab67-916e2cddc6c6 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083675552 +0000 UTC m=+142.001760973 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-webhook-certs") pod "multus-admission-controller-857f4d67dd-gstv2" (UID: "5c3a5def-8ee1-4719-ab67-916e2cddc6c6") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583713 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699627e5-fe61-4db8-885c-07ef0e8fb8fc-serving-cert podName:699627e5-fe61-4db8-885c-07ef0e8fb8fc nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083698913 +0000 UTC m=+142.001784334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/699627e5-fe61-4db8-885c-07ef0e8fb8fc-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-hnh5z" (UID: "699627e5-fe61-4db8-885c-07ef0e8fb8fc") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583772 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-webhook-cert podName:e5113876-cf94-45b5-9edc-e4ac8af59cb9 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083726214 +0000 UTC m=+142.001811635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-webhook-cert") pod "packageserver-d55dfcdfc-jf2cd" (UID: "e5113876-cf94-45b5-9edc-e4ac8af59cb9") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583797 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-apiservice-cert podName:e5113876-cf94-45b5-9edc-e4ac8af59cb9 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083786325 +0000 UTC m=+142.001871746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-apiservice-cert") pod "packageserver-d55dfcdfc-jf2cd" (UID: "e5113876-cf94-45b5-9edc-e4ac8af59cb9") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.583826 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1a1d076-b5f5-41c5-89d5-a6975b170f07-serving-cert podName:d1a1d076-b5f5-41c5-89d5-a6975b170f07 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.083811346 +0000 UTC m=+142.001896767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d1a1d076-b5f5-41c5-89d5-a6975b170f07-serving-cert") pod "service-ca-operator-777779d784-8jmkl" (UID: "d1a1d076-b5f5-41c5-89d5-a6975b170f07") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.584169 4693 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.584228 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-certs podName:d05ddc98-953b-4b53-8027-ba54f58fdf70 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.084206598 +0000 UTC m=+142.002291979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-certs") pod "machine-config-server-n884p" (UID: "d05ddc98-953b-4b53-8027-ba54f58fdf70") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.584465 4693 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: E1125 12:10:21.584499 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-node-bootstrap-token podName:d05ddc98-953b-4b53-8027-ba54f58fdf70 nodeName:}" failed. No retries permitted until 2025-11-25 12:10:22.084490218 +0000 UTC m=+142.002575599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-node-bootstrap-token") pod "machine-config-server-n884p" (UID: "d05ddc98-953b-4b53-8027-ba54f58fdf70") : failed to sync secret cache: timed out waiting for the condition
Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.586006 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t"
Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.588142 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-profile-collector-cert\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd"
Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.588441 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/438f718d-ef67-42c1-b624-7d69c5e6b13f-secret-volume\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg"
Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.603643 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.622060 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.641213 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.661726 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
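Note on the nestedpendingoperations entries above: a failed MountVolume operation is fenced off until now+delay ("No retries permitted until ..."), and the delay grows on consecutive failures from the 500ms seen here. A schematic sketch of that policy, not kubelet source; the doubling and the two-minute cap are assumptions of the sketch:

    // Failed mounts are not retried in a tight loop: each failure schedules
    // the earliest permitted retry, with the delay growing per failure.
    package main

    import (
    	"errors"
    	"fmt"
    	"time"
    )

    type pendingOp struct {
    	failures  int
    	nextRetry time.Time
    }

    // markFailed records a failure and schedules the next permitted retry:
    // 500ms, 1s, 2s, ... capped at maxDelay (cap value assumed here).
    func (op *pendingOp) markFailed(now time.Time) {
    	const initialDelay = 500 * time.Millisecond
    	const maxDelay = 2 * time.Minute
    	delay := initialDelay << op.failures
    	if delay > maxDelay {
    		delay = maxDelay
    	}
    	op.failures++
    	op.nextRetry = now.Add(delay)
    }

    func main() {
    	op := &pendingOp{}
    	mount := func() error { return errors.New("failed to sync secret cache: timed out waiting for the condition") }
    	for i := 0; i < 3; i++ {
    		if err := mount(); err != nil {
    			now := time.Now()
    			op.markFailed(now)
    			fmt.Printf("failed. No retries permitted until %s (durationBeforeRetry %s)\n",
    				op.nextRetry.Format(time.RFC3339Nano), op.nextRetry.Sub(now))
    		}
    	}
    }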
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.702045 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.724804 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.741751 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.760986 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.780323 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.800968 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.821839 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.841449 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.861463 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.881048 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.901557 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.921457 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.941614 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.960768 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 25 12:10:21 crc kubenswrapper[4693]: I1125 12:10:21.981570 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.003072 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.021530 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.041210 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.061470 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.081814 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112080 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-webhook-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112171 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699627e5-fe61-4db8-885c-07ef0e8fb8fc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112230 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a1d076-b5f5-41c5-89d5-a6975b170f07-serving-cert\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112277 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-apiservice-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112425 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-srv-cert\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112485 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-images\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112534 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-srv-cert\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112569 4693 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-proxy-tls\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c77bff0-0391-4a39-890a-8a81b2924f91-proxy-tls\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112696 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699627e5-fe61-4db8-885c-07ef0e8fb8fc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112746 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gstv2\" (UID: \"5c3a5def-8ee1-4719-ab67-916e2cddc6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112770 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-node-bootstrap-token\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112802 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a1d076-b5f5-41c5-89d5-a6975b170f07-config\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112820 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.112869 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-certs\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.113855 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c77bff0-0391-4a39-890a-8a81b2924f91-images\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: 
\"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.114043 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699627e5-fe61-4db8-885c-07ef0e8fb8fc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.115269 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a1d076-b5f5-41c5-89d5-a6975b170f07-config\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.115779 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.116722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gstv2\" (UID: \"5c3a5def-8ee1-4719-ab67-916e2cddc6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.117547 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12e72fe6-ed8c-4f53-a8ee-47e19a656342-srv-cert\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.117946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-proxy-tls\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.118023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1a1d076-b5f5-41c5-89d5-a6975b170f07-serving-cert\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.118590 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699627e5-fe61-4db8-885c-07ef0e8fb8fc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.119436 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-node-bootstrap-token\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.120011 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/667a5672-0a53-472e-a366-1b66dbbc2189-srv-cert\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.120165 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-apiservice-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.121107 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c77bff0-0391-4a39-890a-8a81b2924f91-proxy-tls\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.121949 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.122486 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d05ddc98-953b-4b53-8027-ba54f58fdf70-certs\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.122499 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5113876-cf94-45b5-9edc-e4ac8af59cb9-webhook-cert\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.141353 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.162244 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.181724 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.202296 4693 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.222849 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
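Note on the paired entries above: reconciler_common.go:218 "operationExecutor.MountVolume started" followed by operation_generator.go:637 "MountVolume.SetUp succeeded" is the volume manager reconciling desired state (volumes declared by pods assigned to the node) against actual state (what is currently mounted). A schematic of that loop in plain Go, with invented types, not the kubelet's actual ones:

    // Each reconciler pass mounts whatever is desired but not yet mounted;
    // failures stay in the desired set and are retried with backoff.
    package main

    import "fmt"

    func main() {
    	desired := map[string]string{ // volume -> pod, from pod specs
    		"proxy-tls": "machine-config-operator-74547568cd-t4r4s",
    		"srv-cert":  "olm-operator-6b444d44fb-7qd8t",
    	}
    	actual := map[string]bool{} // volumes already mounted on the node

    	mount := func(vol string) error { return nil } // stand-in for SetUp

    	for vol, pod := range desired {
    		if actual[vol] {
    			continue // already reconciled
    		}
    		fmt.Printf("operationExecutor.MountVolume started for volume %q pod %q\n", vol, pod)
    		if err := mount(vol); err != nil {
    			continue // stays desired; retried next pass with backoff
    		}
    		actual[vol] = true
    		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
    	}
    }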
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e45bad22-7711-44b5-a425-cdc54a795feb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.286074 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7919df1-557b-4835-9a9b-680eac28f2c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zkt9v\" (UID: \"f7919df1-557b-4835-9a9b-680eac28f2c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.297259 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/c3bdbfb7-27fc-41d4-a157-36363c246c38-kube-api-access-8wv4r\") pod \"oauth-openshift-558db77b4-zm7pc\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") " pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.316817 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkn5x\" (UniqueName: \"kubernetes.io/projected/64d12306-ed10-4a16-8c2b-941bfafaa705-kube-api-access-vkn5x\") pod \"openshift-config-operator-7777fb866f-ptflj\" (UID: \"64d12306-ed10-4a16-8c2b-941bfafaa705\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.340836 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw26s\" (UniqueName: \"kubernetes.io/projected/ed425447-604d-40a0-969a-97645b617956-kube-api-access-xw26s\") pod \"openshift-controller-manager-operator-756b6f6bc6-7lrpg\" (UID: \"ed425447-604d-40a0-969a-97645b617956\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.359318 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjw6c\" (UniqueName: \"kubernetes.io/projected/75594753-ed30-49c9-b2ee-b63e64782ab3-kube-api-access-wjw6c\") pod \"apiserver-76f77b778f-c4h8g\" (UID: \"75594753-ed30-49c9-b2ee-b63e64782ab3\") " pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.363123 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.379934 4693 request.go:700] Waited for 1.911873716s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-machine-approver/serviceaccounts/machine-approver-sa/token Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.381780 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/872e0bc7-2c6e-43cb-98ca-18b85200e276-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wwcrs\" (UID: \"872e0bc7-2c6e-43cb-98ca-18b85200e276\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.400655 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s66fd\" (UniqueName: \"kubernetes.io/projected/56882945-e452-4371-bd3b-9fb0e33a5de0-kube-api-access-s66fd\") pod \"machine-approver-56656f9798-7h4dt\" (UID: \"56882945-e452-4371-bd3b-9fb0e33a5de0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.409046 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.417803 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdsr\" (UniqueName: \"kubernetes.io/projected/725c1b7d-81c5-4bbe-99b1-c53b93754feb-kube-api-access-jcdsr\") pod \"console-f9d7485db-4b2tf\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.420210 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.437989 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a21ef08-fa38-470a-a821-3f39a5d72f23-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:22 crc kubenswrapper[4693]: W1125 12:10:22.443144 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56882945_e452_4371_bd3b_9fb0e33a5de0.slice/crio-654d869222c31734e06be358ddf4c55809d37afab5b963230e803ffdae3a2c22 WatchSource:0}: Error finding container 654d869222c31734e06be358ddf4c55809d37afab5b963230e803ffdae3a2c22: Status 404 returned error can't find the container with id 654d869222c31734e06be358ddf4c55809d37afab5b963230e803ffdae3a2c22 Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.464036 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkkk5\" (UniqueName: \"kubernetes.io/projected/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-kube-api-access-dkkk5\") pod \"route-controller-manager-6576b87f9c-slhjf\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.485034 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.486364 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8skz\" (UniqueName: \"kubernetes.io/projected/e45bad22-7711-44b5-a425-cdc54a795feb-kube-api-access-x8skz\") pod \"ingress-operator-5b745b69d9-qwhs4\" (UID: \"e45bad22-7711-44b5-a425-cdc54a795feb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.505019 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.506475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmm5\" (UniqueName: \"kubernetes.io/projected/9f4fda99-711f-4947-aae3-55186580a3cc-kube-api-access-mdmm5\") pod \"openshift-apiserver-operator-796bbdcf4f-2mf49\" (UID: \"9f4fda99-711f-4947-aae3-55186580a3cc\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.513626 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.541050 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhv29\" (UniqueName: \"kubernetes.io/projected/da1a8ea0-27fe-434b-9d60-641c1645b75b-kube-api-access-vhv29\") pod \"apiserver-7bbb656c7d-v5rkj\" (UID: \"da1a8ea0-27fe-434b-9d60-641c1645b75b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.542726 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" event={"ID":"56882945-e452-4371-bd3b-9fb0e33a5de0","Type":"ContainerStarted","Data":"654d869222c31734e06be358ddf4c55809d37afab5b963230e803ffdae3a2c22"} Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.545236 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jz8k\" (UniqueName: \"kubernetes.io/projected/bccd7dbe-e658-4ce4-be99-b6642a5bb498-kube-api-access-9jz8k\") pod \"machine-api-operator-5694c8668f-mkt5h\" (UID: \"bccd7dbe-e658-4ce4-be99-b6642a5bb498\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.551683 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.558025 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.566478 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.576226 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xtdr\" (UniqueName: \"kubernetes.io/projected/d07a0f64-0b4d-4719-9a5f-574120ad186a-kube-api-access-6xtdr\") pod \"authentication-operator-69f744f599-9ljwp\" (UID: \"d07a0f64-0b4d-4719-9a5f-574120ad186a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.580756 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmr8k\" (UniqueName: \"kubernetes.io/projected/18eaddfe-4ea4-4581-afe0-b778eb74ff49-kube-api-access-zmr8k\") pod \"cluster-samples-operator-665b6dd947-wj9xr\" (UID: \"18eaddfe-4ea4-4581-afe0-b778eb74ff49\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.599023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlpv6\" (UniqueName: \"kubernetes.io/projected/2a21ef08-fa38-470a-a821-3f39a5d72f23-kube-api-access-hlpv6\") pod \"cluster-image-registry-operator-dc59b4c8b-q7dkt\" (UID: \"2a21ef08-fa38-470a-a821-3f39a5d72f23\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.601917 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.619445 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.620432 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9q9g\" (UniqueName: \"kubernetes.io/projected/549a8467-81c7-4195-9ead-f0cbdba07d61-kube-api-access-s9q9g\") pod \"console-operator-58897d9998-xtxw4\" (UID: \"549a8467-81c7-4195-9ead-f0cbdba07d61\") " pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.628899 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.639528 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2z9c\" (UniqueName: \"kubernetes.io/projected/413025b6-a706-4ad3-b920-2c9929ddaa0e-kube-api-access-x2z9c\") pod \"controller-manager-879f6c89f-52nbn\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.660606 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6v5r\" (UniqueName: \"kubernetes.io/projected/1579bf21-c619-4e37-b0f4-f7b9727daaf5-kube-api-access-q6v5r\") pod \"dns-operator-744455d44c-7snhp\" (UID: \"1579bf21-c619-4e37-b0f4-f7b9727daaf5\") " pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.678015 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2st5n\" (UniqueName: \"kubernetes.io/projected/a9e1e257-9a52-475a-a5ec-cd6fa9449f24-kube-api-access-2st5n\") pod \"downloads-7954f5f757-skxbw\" (UID: \"a9e1e257-9a52-475a-a5ec-cd6fa9449f24\") " pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.693762 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.704130 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.714286 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjm8w\" (UniqueName: \"kubernetes.io/projected/0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58-kube-api-access-pjm8w\") pod \"service-ca-9c57cc56f-b7p2s\" (UID: \"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58\") " pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.735810 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gprp7\" (UniqueName: \"kubernetes.io/projected/b9acbedc-c0ad-4862-b5f0-05adb69d9bde-kube-api-access-gprp7\") pod \"router-default-5444994796-8r9bv\" (UID: \"b9acbedc-c0ad-4862-b5f0-05adb69d9bde\") " pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.742366 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/559e62da-b780-4e38-95ee-379cc6066ffa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v98gn\" (UID: \"559e62da-b780-4e38-95ee-379cc6066ffa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.742588 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.747834 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.762083 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b4h6\" (UniqueName: \"kubernetes.io/projected/51398f58-1dab-4bf2-a7dc-b8669a515200-kube-api-access-6b4h6\") pod \"package-server-manager-789f6589d5-h6fkp\" (UID: \"51398f58-1dab-4bf2-a7dc-b8669a515200\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.765893 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.784080 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8thdg\" (UniqueName: \"kubernetes.io/projected/d66c42e9-0aba-45bd-867f-6f905804b854-kube-api-access-8thdg\") pod \"etcd-operator-b45778765-p8tpr\" (UID: \"d66c42e9-0aba-45bd-867f-6f905804b854\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.800018 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwx97\" (UniqueName: \"kubernetes.io/projected/5c3a5def-8ee1-4719-ab67-916e2cddc6c6-kube-api-access-lwx97\") pod \"multus-admission-controller-857f4d67dd-gstv2\" (UID: \"5c3a5def-8ee1-4719-ab67-916e2cddc6c6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.814987 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8nzl\" (UniqueName: \"kubernetes.io/projected/667a5672-0a53-472e-a366-1b66dbbc2189-kube-api-access-f8nzl\") pod \"olm-operator-6b444d44fb-7qd8t\" (UID: \"667a5672-0a53-472e-a366-1b66dbbc2189\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.819999 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.836999 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.837915 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm8k7\" (UniqueName: \"kubernetes.io/projected/1c77bff0-0391-4a39-890a-8a81b2924f91-kube-api-access-hm8k7\") pod \"machine-config-operator-74547568cd-t4r4s\" (UID: \"1c77bff0-0391-4a39-890a-8a81b2924f91\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.845667 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.863988 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tn76\" (UniqueName: \"kubernetes.io/projected/438f718d-ef67-42c1-b624-7d69c5e6b13f-kube-api-access-2tn76\") pod \"collect-profiles-29401200-p77sg\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.877788 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8slc7\" (UniqueName: \"kubernetes.io/projected/83eaa7e8-5eda-480d-bf44-9bd329c12e8d-kube-api-access-8slc7\") pod \"machine-config-controller-84d6567774-qwmhz\" (UID: \"83eaa7e8-5eda-480d-bf44-9bd329c12e8d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.899203 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plxqt\" (UniqueName: \"kubernetes.io/projected/699627e5-fe61-4db8-885c-07ef0e8fb8fc-kube-api-access-plxqt\") pod \"kube-storage-version-migrator-operator-b67b599dd-hnh5z\" (UID: \"699627e5-fe61-4db8-885c-07ef0e8fb8fc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.928408 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c4h8g"] Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.936271 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.939325 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pff5h\" (UniqueName: \"kubernetes.io/projected/e5113876-cf94-45b5-9edc-e4ac8af59cb9-kube-api-access-pff5h\") pod \"packageserver-d55dfcdfc-jf2cd\" (UID: \"e5113876-cf94-45b5-9edc-e4ac8af59cb9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.939577 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ptflj"] Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.941746 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.945105 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm2gm\" (UniqueName: \"kubernetes.io/projected/9c2633b3-7860-4862-85de-77bcc6732a6c-kube-api-access-dm2gm\") pod \"migrator-59844c95c7-mpmjj\" (UID: \"9c2633b3-7860-4862-85de-77bcc6732a6c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.961167 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.962962 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.980539 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p25dg\" (UniqueName: \"kubernetes.io/projected/d1a1d076-b5f5-41c5-89d5-a6975b170f07-kube-api-access-p25dg\") pod \"service-ca-operator-777779d784-8jmkl\" (UID: \"d1a1d076-b5f5-41c5-89d5-a6975b170f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.981644 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfw6p\" (UniqueName: \"kubernetes.io/projected/12e72fe6-ed8c-4f53-a8ee-47e19a656342-kube-api-access-vfw6p\") pod \"catalog-operator-68c6474976-mk5xd\" (UID: \"12e72fe6-ed8c-4f53-a8ee-47e19a656342\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:22 crc kubenswrapper[4693]: I1125 12:10:22.999937 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.002865 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m42r\" (UniqueName: \"kubernetes.io/projected/b473ce6c-f37a-472a-a1f2-89332034cdee-kube-api-access-4m42r\") pod \"marketplace-operator-79b997595-q8bsw\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.010783 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.021910 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.028618 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.039086 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.041997 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjqt\" (UniqueName: \"kubernetes.io/projected/d05ddc98-953b-4b53-8027-ba54f58fdf70-kube-api-access-ggjqt\") pod \"machine-config-server-n884p\" (UID: \"d05ddc98-953b-4b53-8027-ba54f58fdf70\") " pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.067623 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n884p" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.067719 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.072711 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.082853 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.092395 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.094522 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.109088 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.136930 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-tls\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.136978 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhk69\" (UniqueName: \"kubernetes.io/projected/9e5a9da0-728f-4181-9912-fa57f2e2fb3a-kube-api-access-lhk69\") pod \"ingress-canary-2ggsx\" (UID: \"9e5a9da0-728f-4181-9912-fa57f2e2fb3a\") " pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137022 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e5a9da0-728f-4181-9912-fa57f2e2fb3a-cert\") pod \"ingress-canary-2ggsx\" (UID: \"9e5a9da0-728f-4181-9912-fa57f2e2fb3a\") " pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137045 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137070 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-bound-sa-token\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137178 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 
12:10:23.137215 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-certificates\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137278 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-trusted-ca\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137304 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz25z\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-kube-api-access-vz25z\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137405 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/264f1d17-cf59-4dbf-ad2f-0272713fe3b0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v4pg\" (UID: \"264f1d17-cf59-4dbf-ad2f-0272713fe3b0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137431 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.137455 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqg5z\" (UniqueName: \"kubernetes.io/projected/264f1d17-cf59-4dbf-ad2f-0272713fe3b0-kube-api-access-dqg5z\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v4pg\" (UID: \"264f1d17-cf59-4dbf-ad2f-0272713fe3b0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.137803 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:23.637784884 +0000 UTC m=+143.555870265 (durationBeforeRetry 500ms). 
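The "Error:" detail that follows completes the E1125 nestedpendingoperations record above, and it is the failure that dominates the rest of this capture: MountVolume.MountDevice for the image-registry PVC cannot build a CSI client because kubevirt.io.hostpath-provisioner has not yet registered with the kubelet (the plugin's own pod, csi-hostpathplugin-nckmj, is still having its volumes mounted in the surrounding records). A CSI node plugin registers by creating a socket in the kubelet's plugin-registration directory; the sketch below simply lists that directory. The /var/lib/kubelet/plugins_registry path is the kubelet default, an assumption rather than something shown in this log.

    // Sketch only: list which CSI drivers have registered with the kubelet by
    // reading its plugin-registration directory. The path is the kubelet
    // default and is an assumption, not something shown in this log.
    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        const regDir = "/var/lib/kubelet/plugins_registry"
        entries, err := os.ReadDir(regDir)
        if err != nil {
            fmt.Fprintln(os.Stderr, "cannot read registration dir:", err)
            os.Exit(1)
        }
        for _, e := range entries {
            // An entry for kubevirt.io.hostpath-provisioner should appear
            // here once csi-hostpathplugin-nckmj is up and registered, at
            // which point the retry loop below can complete.
            fmt.Println(e.Name())
        }
    }
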
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.140119 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zm7pc"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.140957 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4b2tf"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.143488 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.148149 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.149285 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4"] Nov 25 12:10:23 crc kubenswrapper[4693]: W1125 12:10:23.194065 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7919df1_557b_4835_9a9b_680eac28f2c7.slice/crio-4d6f94f4bbbfb3350cbef6277404aec9b7f7b5f53f90fe579e1eefec967b46e7 WatchSource:0}: Error finding container 4d6f94f4bbbfb3350cbef6277404aec9b7f7b5f53f90fe579e1eefec967b46e7: Status 404 returned error can't find the container with id 4d6f94f4bbbfb3350cbef6277404aec9b7f7b5f53f90fe579e1eefec967b46e7 Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.229943 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238189 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238521 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-trusted-ca\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238546 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-config-volume\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238606 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz25z\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-kube-api-access-vz25z\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238661 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-metrics-tls\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238771 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/264f1d17-cf59-4dbf-ad2f-0272713fe3b0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v4pg\" (UID: \"264f1d17-cf59-4dbf-ad2f-0272713fe3b0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238800 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-csi-data-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238823 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238898 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dqg5z\" (UniqueName: \"kubernetes.io/projected/264f1d17-cf59-4dbf-ad2f-0272713fe3b0-kube-api-access-dqg5z\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v4pg\" (UID: \"264f1d17-cf59-4dbf-ad2f-0272713fe3b0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.238989 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5jvr\" (UniqueName: \"kubernetes.io/projected/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-kube-api-access-r5jvr\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.239016 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhgg\" (UniqueName: \"kubernetes.io/projected/cb5dac5c-eefa-4f03-abd2-8619af829eff-kube-api-access-vjhgg\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.239129 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-tls\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.239175 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-socket-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.239243 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhk69\" (UniqueName: \"kubernetes.io/projected/9e5a9da0-728f-4181-9912-fa57f2e2fb3a-kube-api-access-lhk69\") pod \"ingress-canary-2ggsx\" (UID: \"9e5a9da0-728f-4181-9912-fa57f2e2fb3a\") " pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.239660 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:23.739634591 +0000 UTC m=+143.657719972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.240220 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e5a9da0-728f-4181-9912-fa57f2e2fb3a-cert\") pod \"ingress-canary-2ggsx\" (UID: \"9e5a9da0-728f-4181-9912-fa57f2e2fb3a\") " pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.240290 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.240323 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-mountpoint-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.240418 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-bound-sa-token\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.242406 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-trusted-ca\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.248355 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.249311 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-tls\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.249655 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.249706 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-plugins-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.254754 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.255189 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-certificates\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.253884 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/264f1d17-cf59-4dbf-ad2f-0272713fe3b0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v4pg\" (UID: \"264f1d17-cf59-4dbf-ad2f-0272713fe3b0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.256421 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:23.756405153 +0000 UTC m=+143.674490534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.262879 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-registration-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.257352 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-certificates\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.258326 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.280024 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhk69\" (UniqueName: \"kubernetes.io/projected/9e5a9da0-728f-4181-9912-fa57f2e2fb3a-kube-api-access-lhk69\") pod \"ingress-canary-2ggsx\" (UID: \"9e5a9da0-728f-4181-9912-fa57f2e2fb3a\") " pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:23 crc kubenswrapper[4693]: W1125 12:10:23.284350 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode45bad22_7711_44b5_a425_cdc54a795feb.slice/crio-82bfdb9bc6a682918bb2cf2183bb091f96a6bb8a27bbb84ee6fb6591bf5053a8 WatchSource:0}: Error finding container 82bfdb9bc6a682918bb2cf2183bb091f96a6bb8a27bbb84ee6fb6591bf5053a8: Status 404 returned error can't find the container with id 82bfdb9bc6a682918bb2cf2183bb091f96a6bb8a27bbb84ee6fb6591bf5053a8 Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.301746 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz25z\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-kube-api-access-vz25z\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.304876 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e5a9da0-728f-4181-9912-fa57f2e2fb3a-cert\") pod \"ingress-canary-2ggsx\" (UID: \"9e5a9da0-728f-4181-9912-fa57f2e2fb3a\") " pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.338264 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-bound-sa-token\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.365672 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.365896 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:23.865872286 +0000 UTC m=+143.783957667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366094 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqg5z\" (UniqueName: \"kubernetes.io/projected/264f1d17-cf59-4dbf-ad2f-0272713fe3b0-kube-api-access-dqg5z\") pod \"control-plane-machine-set-operator-78cbb6b69f-5v4pg\" (UID: \"264f1d17-cf59-4dbf-ad2f-0272713fe3b0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366405 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-registration-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366431 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-config-volume\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366459 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-metrics-tls\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366501 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-csi-data-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366524 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5jvr\" (UniqueName: \"kubernetes.io/projected/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-kube-api-access-r5jvr\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366546 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhgg\" (UniqueName: \"kubernetes.io/projected/cb5dac5c-eefa-4f03-abd2-8619af829eff-kube-api-access-vjhgg\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366582 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-socket-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366631 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-mountpoint-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366674 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-plugins-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366699 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.366791 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-csi-data-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.367052 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:23.867036922 +0000 UTC m=+143.785122503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.367071 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-registration-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.367291 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-socket-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.367341 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-mountpoint-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.367387 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cb5dac5c-eefa-4f03-abd2-8619af829eff-plugins-dir\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.375568 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-metrics-tls\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.399548 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-config-volume\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.426343 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5jvr\" (UniqueName: \"kubernetes.io/projected/e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba-kube-api-access-r5jvr\") pod \"dns-default-jtvmf\" (UID: \"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba\") " pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.443423 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhgg\" (UniqueName: \"kubernetes.io/projected/cb5dac5c-eefa-4f03-abd2-8619af829eff-kube-api-access-vjhgg\") pod \"csi-hostpathplugin-nckmj\" (UID: \"cb5dac5c-eefa-4f03-abd2-8619af829eff\") " pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.472469 4693 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.472897 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:23.972882093 +0000 UTC m=+143.890967464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: W1125 12:10:23.552461 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd05ddc98_953b_4b53_8027_ba54f58fdf70.slice/crio-361f091a74af66061537fcbb2f777efca956fa891c09530cf9f13dfd1b4c2e5b WatchSource:0}: Error finding container 361f091a74af66061537fcbb2f777efca956fa891c09530cf9f13dfd1b4c2e5b: Status 404 returned error can't find the container with id 361f091a74af66061537fcbb2f777efca956fa891c09530cf9f13dfd1b4c2e5b Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.571746 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8r9bv" event={"ID":"b9acbedc-c0ad-4862-b5f0-05adb69d9bde","Type":"ContainerStarted","Data":"87ee92d2c23f8aaf2992431678ef3b514b17bf02c57e91e9bd7b17a514ccadad"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.574260 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" event={"ID":"c3bdbfb7-27fc-41d4-a157-36363c246c38","Type":"ContainerStarted","Data":"d842ea447c1ed0961d587012212626adf65513af7fb89f71f898740666671a0c"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.575925 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.576339 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.076323559 +0000 UTC m=+143.994408930 (durationBeforeRetry 500ms). 
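The record continues below with the now-familiar MountDevice error. By this point the pattern is fully established: pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 is being handed from the old registry pod (8f668bae-612b-4b75-9490-919e737c6a3b) to the new one (5272f00f-cfb7-49dc-860c-50ec9ee0bd32), and roughly every 500ms both the TearDown on the old side and the MountDevice on the new side fail with the same "driver not found" error. If this journal is exported to a file, a few lines of Go can tally the two retry streams; "kubelet.log" is a hypothetical filename for such an export.

    // Illustrative only: tally the mount/teardown retries for this PVC in an
    // exported copy of the journal. "kubelet.log" is a hypothetical filename.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "strings"
    )

    func main() {
        f, err := os.Open("kubelet.log")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        defer f.Close()

        counts := map[string]int{}
        sc := bufio.NewScanner(f)
        sc.Buffer(make([]byte, 1<<20), 1<<20) // journal lines can be very long
        for sc.Scan() {
            line := sc.Text()
            if !strings.Contains(line, "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") {
                continue
            }
            switch {
            case strings.Contains(line, "MountVolume.MountDevice failed"):
                counts["mount"]++
            case strings.Contains(line, "UnmountVolume.TearDown failed"):
                counts["teardown"]++
            }
        }
        if err := sc.Err(); err != nil {
            fmt.Fprintln(os.Stderr, err)
        }
        fmt.Println(counts) // e.g. map[mount:7 teardown:7]
    }
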
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.578852 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ggsx" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.583407 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" event={"ID":"64d12306-ed10-4a16-8c2b-941bfafaa705","Type":"ContainerStarted","Data":"80ab5f837460cefac4a9c5b6894003ac5cc16be416d097f813b47e1a84ee97de"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.583465 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" event={"ID":"64d12306-ed10-4a16-8c2b-941bfafaa705","Type":"ContainerStarted","Data":"c4024ef0e256ad940a9dd0cdadd212b63eeacefc1569506e703be18105d09f74"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.584712 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" event={"ID":"e45bad22-7711-44b5-a425-cdc54a795feb","Type":"ContainerStarted","Data":"82bfdb9bc6a682918bb2cf2183bb091f96a6bb8a27bbb84ee6fb6591bf5053a8"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.584961 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.587056 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" event={"ID":"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b","Type":"ContainerStarted","Data":"15c4a35de0fbe2b6caf31df7c0b3bb7388d92a1d76aa302a2e1969f551f07abc"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.597500 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" event={"ID":"75594753-ed30-49c9-b2ee-b63e64782ab3","Type":"ContainerStarted","Data":"a02c2e3f37c5bc8c0a36a8239c71ce44bfb08577379b9868863c78dd223d4436"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.600906 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4b2tf" event={"ID":"725c1b7d-81c5-4bbe-99b1-c53b93754feb","Type":"ContainerStarted","Data":"eb00f2113aa83042f6c2ef890aee61801c3239fd27fb53f9074ee3c4175991a5"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.608429 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" event={"ID":"f7919df1-557b-4835-9a9b-680eac28f2c7","Type":"ContainerStarted","Data":"4d6f94f4bbbfb3350cbef6277404aec9b7f7b5f53f90fe579e1eefec967b46e7"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.609338 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mkt5h"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.611841 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.613048 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" event={"ID":"ed425447-604d-40a0-969a-97645b617956","Type":"ContainerStarted","Data":"bef64a43c0f333c48742cc11304666efcf9d51c4f65a190de026324ef50f302f"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.617511 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" event={"ID":"56882945-e452-4371-bd3b-9fb0e33a5de0","Type":"ContainerStarted","Data":"a64651822cff0052038ffe97c7f52dc1e26d6f4b044215e59b108f36c0b2c38a"} Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.628764 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.632877 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-52nbn"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.634984 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9ljwp"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.677497 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:23 
crc kubenswrapper[4693]: E1125 12:10:23.677996 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.177931229 +0000 UTC m=+144.096016610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.692328 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.693225 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.193207284 +0000 UTC m=+144.111292665 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.702184 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:23 crc kubenswrapper[4693]: W1125 12:10:23.715432 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod413025b6_a706_4ad3_b920_2c9929ddaa0e.slice/crio-75b1b27f86fa619d65d7ff0dc6a126f56683fc781ed84bf3dbe3be76ecc66558 WatchSource:0}: Error finding container 75b1b27f86fa619d65d7ff0dc6a126f56683fc781ed84bf3dbe3be76ecc66558: Status 404 returned error can't find the container with id 75b1b27f86fa619d65d7ff0dc6a126f56683fc781ed84bf3dbe3be76ecc66558 Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.730863 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.795218 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.795562 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.295545946 +0000 UTC m=+144.213631317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.820714 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.826221 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.844096 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-skxbw"] Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.897420 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.897761 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.397748044 +0000 UTC m=+144.315833425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:23 crc kubenswrapper[4693]: I1125 12:10:23.998223 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:23 crc kubenswrapper[4693]: E1125 12:10:23.999421 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.498994272 +0000 UTC m=+144.417079653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.062145 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7snhp"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.084249 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xtxw4"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.102263 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.102642 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.602628364 +0000 UTC m=+144.520713745 (durationBeforeRetry 500ms). 
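Before the error detail below resumes, a note on the deadline format in these E1125 records: "2025-11-25 12:10:24.602628364 +0000 UTC m=+144.520713745" is Go's default time.Time formatting, and the "m=+" suffix is the monotonic-clock reading in seconds since the process started. m=+144 at 12:10:24 therefore dates the kubenswrapper start to a little over two minutes earlier, matching the kubelet restart this journal captures. A minimal demonstration:

    // Minimal demonstration that Go's time.Time printing carries the
    // monotonic reading seen in the log deadlines ("m=+...", seconds since
    // the process started).
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        deadline := time.Now().Add(500 * time.Millisecond)
        // Prints something like:
        //   2025-11-25 12:10:24.602628364 +0000 UTC m=+0.500123456
        // In the kubelet's case the process is ~144s old, hence m=+144.x.
        fmt.Println(deadline)
    }
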
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.104919 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p8tpr"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.112155 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-b7p2s"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.120415 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.204790 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.205517 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.705498243 +0000 UTC m=+144.623583624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.303619 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.306690 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.307827 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.308407 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.808385931 +0000 UTC m=+144.726471312 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.309432 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q8bsw"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.310633 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.318930 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp"] Nov 25 12:10:24 crc kubenswrapper[4693]: W1125 12:10:24.327986 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83eaa7e8_5eda_480d_bf44_9bd329c12e8d.slice/crio-52686a17dda0163a6fc2fb00257acb9e8cef69722f1214a3ca132a713da0b225 WatchSource:0}: Error finding container 52686a17dda0163a6fc2fb00257acb9e8cef69722f1214a3ca132a713da0b225: Status 404 returned error can't find the container with id 52686a17dda0163a6fc2fb00257acb9e8cef69722f1214a3ca132a713da0b225 Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.353446 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.355753 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.357210 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.358362 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.360051 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gstv2"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.362027 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z"] Nov 25 12:10:24 crc kubenswrapper[4693]: W1125 12:10:24.363569 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51398f58_1dab_4bf2_a7dc_b8669a515200.slice/crio-ec18dc5e098b402addb4583f6e19bedf88ea52b6bbd26d27daf08cb070bdc11f WatchSource:0}: Error finding container ec18dc5e098b402addb4583f6e19bedf88ea52b6bbd26d27daf08cb070bdc11f: Status 404 returned error can't find the container with id ec18dc5e098b402addb4583f6e19bedf88ea52b6bbd26d27daf08cb070bdc11f Nov 25 12:10:24 crc kubenswrapper[4693]: W1125 12:10:24.369001 4693 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c77bff0_0391_4a39_890a_8a81b2924f91.slice/crio-f9ba5ce31a071b9c27365b06198e121d02891888b1dff89d5368b779d0717f9d WatchSource:0}: Error finding container f9ba5ce31a071b9c27365b06198e121d02891888b1dff89d5368b779d0717f9d: Status 404 returned error can't find the container with id f9ba5ce31a071b9c27365b06198e121d02891888b1dff89d5368b779d0717f9d Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.409303 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.410236 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:24.910220208 +0000 UTC m=+144.828305589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.497347 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.499842 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ggsx"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.501380 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.503469 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.511682 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.513150 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.013131907 +0000 UTC m=+144.931217498 (durationBeforeRetry 500ms). 
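One more annotation before the mount error below repeats. Interleaved with the volume retries, the I1125 records show the kubelet sync loop draining two event sources: "SyncLoop UPDATE" pod updates arriving from the API server (source="api") and "SyncLoop (PLEG)" container-lifecycle events from the runtime. The W1125 "Failed to process watch event" entries fit the same startup picture: each container id that cAdvisor briefly cannot find (for example the id beginning 82bfdb9bc6a6, and f9ba5ce31a07 just above) reappears moments later in a ContainerStarted event, which suggests a benign create-time race rather than a lost container. A minimal fan-in model of the two sources, illustrative only, not kubelet code:

    // Minimal fan-in model of the two SyncLoop event sources in these
    // records; illustrative only, not kubelet code.
    package main

    import "fmt"

    func main() {
        api := make(chan string, 1)  // "SyncLoop UPDATE" source="api"
        pleg := make(chan string, 1) // "SyncLoop (PLEG)" runtime events
        api <- `UPDATE pods=["openshift-dns/dns-default-jtvmf"]`
        pleg <- `ContainerStarted 75b1b27f86fa619d65d7ff0dc6a126f56683fc781ed84bf3dbe3be76ecc66558`
        close(api)
        close(pleg)

        for api != nil || pleg != nil {
            select {
            case u, ok := <-api:
                if !ok {
                    api = nil
                    continue
                }
                fmt.Println("SyncLoop UPDATE:", u)
            case e, ok := <-pleg:
                if !ok {
                    pleg = nil
                    continue
                }
                fmt.Println("SyncLoop (PLEG):", e)
            }
        }
    }
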
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.517344 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jtvmf"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.556743 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nckmj"] Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.615876 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.615989 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.115972565 +0000 UTC m=+145.034057946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.616355 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.617015 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.116998767 +0000 UTC m=+145.035084148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.623350 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" event={"ID":"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b","Type":"ContainerStarted","Data":"7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.624102 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" event={"ID":"d1a1d076-b5f5-41c5-89d5-a6975b170f07","Type":"ContainerStarted","Data":"1d3041dcc996b6f0e04f1ac8ed0aee0ae352b1d1b3e6f31875f8c6defeaf07f3"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.624846 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" event={"ID":"d07a0f64-0b4d-4719-9a5f-574120ad186a","Type":"ContainerStarted","Data":"3aa19ec6b3a3d43dffff7ecb02f0d92b073cd98c48e04f19df98307812cd72f6"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.625654 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" event={"ID":"699627e5-fe61-4db8-885c-07ef0e8fb8fc","Type":"ContainerStarted","Data":"acf12a5a61fd112421a2958b03537819f66bc5ad6cb0140225425cbca830cd77"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.628161 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" event={"ID":"9f4fda99-711f-4947-aae3-55186580a3cc","Type":"ContainerStarted","Data":"6920547e99cf1065d96f8fa0243f4506a9cfcb35e697ccf9f0f368f38a6c546c"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.631105 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" event={"ID":"413025b6-a706-4ad3-b920-2c9929ddaa0e","Type":"ContainerStarted","Data":"75b1b27f86fa619d65d7ff0dc6a126f56683fc781ed84bf3dbe3be76ecc66558"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.634907 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" event={"ID":"667a5672-0a53-472e-a366-1b66dbbc2189","Type":"ContainerStarted","Data":"ddbe7e957167d3dde5d5b724b88854bde47ae28c3aa00433d34b6bc8a04b6334"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.636117 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" event={"ID":"1c77bff0-0391-4a39-890a-8a81b2924f91","Type":"ContainerStarted","Data":"f9ba5ce31a071b9c27365b06198e121d02891888b1dff89d5368b779d0717f9d"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.637465 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" 
event={"ID":"f7919df1-557b-4835-9a9b-680eac28f2c7","Type":"ContainerStarted","Data":"0f1c6b4ff676ce20d93d08e75dc1e391ca1a26f1ce10dd15dccc9bf9107f4d53"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.638293 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" event={"ID":"1579bf21-c619-4e37-b0f4-f7b9727daaf5","Type":"ContainerStarted","Data":"ff822560dcd31a2af153909dbcbedada5c5b8c45f8df8de1a1033873ef9ff288"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.639133 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" event={"ID":"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58","Type":"ContainerStarted","Data":"9c6d28270b9be2672b80770d7a2547e2f1d2ba3da4f7b3996f914dcb3669d165"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.639973 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" event={"ID":"d66c42e9-0aba-45bd-867f-6f905804b854","Type":"ContainerStarted","Data":"3c1d1f8c60d49c18f6097dede6888c9f62f889f4f22d6dcf54857adb34216b46"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.640951 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" event={"ID":"549a8467-81c7-4195-9ead-f0cbdba07d61","Type":"ContainerStarted","Data":"563f46631d9df1113e098ef3a7d401311a67f7c7b8a098979851183422a91f03"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.642128 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" event={"ID":"438f718d-ef67-42c1-b624-7d69c5e6b13f","Type":"ContainerStarted","Data":"2ba66cee4d508a6bceabb73b4520cc02861b78a92c739de64884f39bd484ee95"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.643226 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" event={"ID":"872e0bc7-2c6e-43cb-98ca-18b85200e276","Type":"ContainerStarted","Data":"003be1b7b9d56965257f5d844025b38e33571052a6d99e4955509ac58d7c6fb9"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.644523 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" event={"ID":"bccd7dbe-e658-4ce4-be99-b6642a5bb498","Type":"ContainerStarted","Data":"5b58531861f3d671807a47b74421dd6402524ff8e25a607ce4ec293e1e41c7e3"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.645850 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" event={"ID":"ed425447-604d-40a0-969a-97645b617956","Type":"ContainerStarted","Data":"a63e499d96d57ae7eca7d0945d349d2765c50a811fb55e765d62ed90f9053eac"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.647706 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n884p" event={"ID":"d05ddc98-953b-4b53-8027-ba54f58fdf70","Type":"ContainerStarted","Data":"318232b68d16230fe5f8de568528bdc36e6d9e890052f009c078cb235ff3e6e6"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.647753 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n884p" 
event={"ID":"d05ddc98-953b-4b53-8027-ba54f58fdf70","Type":"ContainerStarted","Data":"361f091a74af66061537fcbb2f777efca956fa891c09530cf9f13dfd1b4c2e5b"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.649440 4693 generic.go:334] "Generic (PLEG): container finished" podID="64d12306-ed10-4a16-8c2b-941bfafaa705" containerID="80ab5f837460cefac4a9c5b6894003ac5cc16be416d097f813b47e1a84ee97de" exitCode=0 Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.649518 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" event={"ID":"64d12306-ed10-4a16-8c2b-941bfafaa705","Type":"ContainerDied","Data":"80ab5f837460cefac4a9c5b6894003ac5cc16be416d097f813b47e1a84ee97de"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.651532 4693 generic.go:334] "Generic (PLEG): container finished" podID="75594753-ed30-49c9-b2ee-b63e64782ab3" containerID="5dbcde8837c074fc5b8c420782c34cd76ff39e5f8d6b164d8890ffc06b9c0c35" exitCode=0 Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.651593 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" event={"ID":"75594753-ed30-49c9-b2ee-b63e64782ab3","Type":"ContainerDied","Data":"5dbcde8837c074fc5b8c420782c34cd76ff39e5f8d6b164d8890ffc06b9c0c35"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.652760 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-skxbw" event={"ID":"a9e1e257-9a52-475a-a5ec-cd6fa9449f24","Type":"ContainerStarted","Data":"598b46e0abae49198c861ae673397b5082729175fae7020881e5e9c6e50fec13"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.654143 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4b2tf" event={"ID":"725c1b7d-81c5-4bbe-99b1-c53b93754feb","Type":"ContainerStarted","Data":"5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.655346 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" event={"ID":"12e72fe6-ed8c-4f53-a8ee-47e19a656342","Type":"ContainerStarted","Data":"49be498846cccf59475e2d89ce5799fb1cd41245a82d1f2e02dc446e86304167"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.656343 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" event={"ID":"51398f58-1dab-4bf2-a7dc-b8669a515200","Type":"ContainerStarted","Data":"ec18dc5e098b402addb4583f6e19bedf88ea52b6bbd26d27daf08cb070bdc11f"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.657433 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" event={"ID":"b473ce6c-f37a-472a-a1f2-89332034cdee","Type":"ContainerStarted","Data":"47d140250ec5a46a9e6329df747ce24cc3d94f3e25b2910fdcb1b818fd72ae4c"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.658561 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" event={"ID":"83eaa7e8-5eda-480d-bf44-9bd329c12e8d","Type":"ContainerStarted","Data":"52686a17dda0163a6fc2fb00257acb9e8cef69722f1214a3ca132a713da0b225"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.659713 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" 
event={"ID":"2a21ef08-fa38-470a-a821-3f39a5d72f23","Type":"ContainerStarted","Data":"c0611ece802206a1c11586e0697d6c6690b248ce3c4088bd5063ee78deadbeb8"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.660984 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" event={"ID":"da1a8ea0-27fe-434b-9d60-641c1645b75b","Type":"ContainerStarted","Data":"120c306c9c15affc3d8449f07decf0738346ac8d15257b0e76bf6dd21ec796e5"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.661812 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" event={"ID":"5c3a5def-8ee1-4719-ab67-916e2cddc6c6","Type":"ContainerStarted","Data":"6dc37ff31957659e21866c456b1ff653df1890349e10379d4521125369a944ab"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.662779 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" event={"ID":"559e62da-b780-4e38-95ee-379cc6066ffa","Type":"ContainerStarted","Data":"c8f432f95baa9e52e26371e79b844f41d9ffc722cc5174d801cfd8c30acd9963"} Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.717986 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.719387 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.21935195 +0000 UTC m=+145.137437331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: W1125 12:10:24.726684 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ba2980_9fa8_4674_a3c9_b10f0b4df7ba.slice/crio-12f087a12dbb644d3bf12977ebc7c178ab4a7bb0b5187ba1db97faf09542d647 WatchSource:0}: Error finding container 12f087a12dbb644d3bf12977ebc7c178ab4a7bb0b5187ba1db97faf09542d647: Status 404 returned error can't find the container with id 12f087a12dbb644d3bf12977ebc7c178ab4a7bb0b5187ba1db97faf09542d647 Nov 25 12:10:24 crc kubenswrapper[4693]: W1125 12:10:24.734874 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb5dac5c_eefa_4f03_abd2_8619af829eff.slice/crio-9b7a2560ad97ce622a13152a6d7331eb7d2a84ab40c403f8f54022d97d93e4d9 WatchSource:0}: Error finding container 9b7a2560ad97ce622a13152a6d7331eb7d2a84ab40c403f8f54022d97d93e4d9: Status 404 returned error can't find the container with id 9b7a2560ad97ce622a13152a6d7331eb7d2a84ab40c403f8f54022d97d93e4d9 Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.819677 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.820996 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.320975979 +0000 UTC m=+145.239061560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.921272 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.921483 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 12:10:25.421458374 +0000 UTC m=+145.339543755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:24 crc kubenswrapper[4693]: I1125 12:10:24.921534 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:24 crc kubenswrapper[4693]: E1125 12:10:24.921886 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.421871036 +0000 UTC m=+145.339956417 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.023182 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.023615 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.523588279 +0000 UTC m=+145.441673660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.023761 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.024100 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.524087114 +0000 UTC m=+145.442172495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.125309 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.125946 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.625924751 +0000 UTC m=+145.544010132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.227998 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.228868 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.728853481 +0000 UTC m=+145.646938852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.329324 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.329843 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.829821621 +0000 UTC m=+145.747907002 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.430847 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.431319 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:25.931302236 +0000 UTC m=+145.849387617 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.535010 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.535775 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.035737414 +0000 UTC m=+145.953822805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.535905 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.537179 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.037164048 +0000 UTC m=+145.955249429 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.637008 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.637141 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.137113565 +0000 UTC m=+146.055198956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.637262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.637816 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.137804007 +0000 UTC m=+146.055889388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.670463 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" event={"ID":"c3bdbfb7-27fc-41d4-a157-36363c246c38","Type":"ContainerStarted","Data":"7c586a16827a9c02f4fc8d83ff6e9b828815ab99ee657a7ddc97e366593c0961"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.670984 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.674539 4693 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zm7pc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.674604 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" podUID="c3bdbfb7-27fc-41d4-a157-36363c246c38" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.675822 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" event={"ID":"438f718d-ef67-42c1-b624-7d69c5e6b13f","Type":"ContainerStarted","Data":"fc09e76ed5002e18747cc585fb18fd30f052755db3f531c3ab13577b124cbc10"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.677621 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" 
event={"ID":"d07a0f64-0b4d-4719-9a5f-574120ad186a","Type":"ContainerStarted","Data":"3518d27a10f5d92affe3c7c18d81d63b8ede3ec108cf00cfc5e881596f7b18fd"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.685810 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" event={"ID":"1c77bff0-0391-4a39-890a-8a81b2924f91","Type":"ContainerStarted","Data":"a77ba658998af761c2fba64f1f73ad1f8b4342920692ed799413c6356649f7cd"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.704781 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" podStartSLOduration=120.704745548 podStartE2EDuration="2m0.704745548s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:25.703844361 +0000 UTC m=+145.621929762" watchObservedRunningTime="2025-11-25 12:10:25.704745548 +0000 UTC m=+145.622830929" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.718288 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" event={"ID":"18eaddfe-4ea4-4581-afe0-b778eb74ff49","Type":"ContainerStarted","Data":"14f1d2e1e22912142d703076b545304a49d0f622f3dea52e8387ee52d55a6588"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.720724 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" event={"ID":"51398f58-1dab-4bf2-a7dc-b8669a515200","Type":"ContainerStarted","Data":"803025893dabc0036cc27dadb101ac8ce967aae68ff5c0ed2724d3ce194d40a9"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.730933 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9ljwp" podStartSLOduration=120.730880211 podStartE2EDuration="2m0.730880211s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:25.72924557 +0000 UTC m=+145.647330961" watchObservedRunningTime="2025-11-25 12:10:25.730880211 +0000 UTC m=+145.648965592" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.735122 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" event={"ID":"872e0bc7-2c6e-43cb-98ca-18b85200e276","Type":"ContainerStarted","Data":"a8eead0d9c2660bb806ff197994c827e1fa40fb77d510da33678490b0f17e484"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.739018 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.740790 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-25 12:10:26.240755838 +0000 UTC m=+146.158841219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.747427 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" event={"ID":"e45bad22-7711-44b5-a425-cdc54a795feb","Type":"ContainerStarted","Data":"cfb297611df6b61c8b4ef2095b88e353e29648b17b7298e9f2d1a5d5260448ab"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.753278 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" event={"ID":"9c2633b3-7860-4862-85de-77bcc6732a6c","Type":"ContainerStarted","Data":"232dc29a3bb17ef2c255abe70af588caa6e091f73421530b5f3c31c0f379a02d"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.754314 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jtvmf" event={"ID":"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba","Type":"ContainerStarted","Data":"12f087a12dbb644d3bf12977ebc7c178ab4a7bb0b5187ba1db97faf09542d647"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.793610 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" event={"ID":"d66c42e9-0aba-45bd-867f-6f905804b854","Type":"ContainerStarted","Data":"6a41cb547e9d3500b2ee676bfe4b83c80143fc88eb2d44c1ff5cb4ae081f3aee"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.809291 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" event={"ID":"56882945-e452-4371-bd3b-9fb0e33a5de0","Type":"ContainerStarted","Data":"0176f01cb8e57b0286eb3c0251b36f224276ef26be272d6d667dd0be68d504ef"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.832832 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7h4dt" podStartSLOduration=120.83280165 podStartE2EDuration="2m0.83280165s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:25.832134599 +0000 UTC m=+145.750219990" watchObservedRunningTime="2025-11-25 12:10:25.83280165 +0000 UTC m=+145.750887031" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.834555 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wwcrs" podStartSLOduration=119.834548374 podStartE2EDuration="1m59.834548374s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:25.788351888 +0000 UTC m=+145.706437269" watchObservedRunningTime="2025-11-25 12:10:25.834548374 +0000 UTC m=+145.752633745" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.841708 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.850342 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.350298344 +0000 UTC m=+146.268383725 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.852725 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" event={"ID":"b473ce6c-f37a-472a-a1f2-89332034cdee","Type":"ContainerStarted","Data":"ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.853171 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.856185 4693 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q8bsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.856241 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.862558 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8r9bv" event={"ID":"b9acbedc-c0ad-4862-b5f0-05adb69d9bde","Type":"ContainerStarted","Data":"d904da76f3836783c27757302a4358eadcfe527aa63df3a65156861647369bfd"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.873341 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" event={"ID":"667a5672-0a53-472e-a366-1b66dbbc2189","Type":"ContainerStarted","Data":"5ff9db3e687b68b0d9d3daa679c7d09686f279a92cf914b8e2a0b7e7184cb1da"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.873601 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" podStartSLOduration=119.873576348 podStartE2EDuration="1m59.873576348s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-11-25 12:10:25.872642788 +0000 UTC m=+145.790728169" watchObservedRunningTime="2025-11-25 12:10:25.873576348 +0000 UTC m=+145.791661739" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.874302 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.876511 4693 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7qd8t container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.876552 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" podUID="667a5672-0a53-472e-a366-1b66dbbc2189" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.879988 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" event={"ID":"413025b6-a706-4ad3-b920-2c9929ddaa0e","Type":"ContainerStarted","Data":"bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.880731 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.884824 4693 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-52nbn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.884872 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" podUID="413025b6-a706-4ad3-b920-2c9929ddaa0e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.885238 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-skxbw" event={"ID":"a9e1e257-9a52-475a-a5ec-cd6fa9449f24","Type":"ContainerStarted","Data":"e0bce37eb650ea3ad7bb4b511c8e38dae1f62677f6b0c7632dcd0b0d5df82f33"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.885649 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.889383 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.889439 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.897097 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" event={"ID":"e5113876-cf94-45b5-9edc-e4ac8af59cb9","Type":"ContainerStarted","Data":"9a3a262e1349298f8304d44dd1ec31a3e43e96b465f30163574803fbb7f6398c"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.897447 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.902544 4693 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jf2cd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.902598 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" podUID="e5113876-cf94-45b5-9edc-e4ac8af59cb9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.903619 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8r9bv" podStartSLOduration=119.903604401 podStartE2EDuration="1m59.903604401s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:25.903208779 +0000 UTC m=+145.821294160" watchObservedRunningTime="2025-11-25 12:10:25.903604401 +0000 UTC m=+145.821689782" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.906604 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" event={"ID":"bccd7dbe-e658-4ce4-be99-b6642a5bb498","Type":"ContainerStarted","Data":"28885f5c9be697d595c1fbd1d7acdb4066b84d9b48c42641bb23b9454a068720"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.912253 4693 generic.go:334] "Generic (PLEG): container finished" podID="da1a8ea0-27fe-434b-9d60-641c1645b75b" containerID="a131b4ff7f9eed8c61a1dbc01ea95ed5f70ea494f4ac98203f271c7f941ca028" exitCode=0 Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.912328 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" event={"ID":"da1a8ea0-27fe-434b-9d60-641c1645b75b","Type":"ContainerDied","Data":"a131b4ff7f9eed8c61a1dbc01ea95ed5f70ea494f4ac98203f271c7f941ca028"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.926920 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" event={"ID":"549a8467-81c7-4195-9ead-f0cbdba07d61","Type":"ContainerStarted","Data":"b64e6bbd37d891a6dd5ae6c599e2a272835d8bcaa7893069c74cfb2c1e6ac093"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.927589 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.932322 4693 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-xtxw4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.932363 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" podUID="549a8467-81c7-4195-9ead-f0cbdba07d61" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.941169 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" event={"ID":"12e72fe6-ed8c-4f53-a8ee-47e19a656342","Type":"ContainerStarted","Data":"395d16f3377efc905bd0bb811b91d52631d457f5a91875105efba5c41ac0059b"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.941331 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" podStartSLOduration=119.941305263 podStartE2EDuration="1m59.941305263s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:25.93798281 +0000 UTC m=+145.856068211" watchObservedRunningTime="2025-11-25 12:10:25.941305263 +0000 UTC m=+145.859390644" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.942547 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.942685 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:25 crc kubenswrapper[4693]: E1125 12:10:25.943123 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.44310777 +0000 UTC m=+146.361193151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.944602 4693 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mk5xd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.944635 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" podUID="12e72fe6-ed8c-4f53-a8ee-47e19a656342" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.951189 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" event={"ID":"699627e5-fe61-4db8-885c-07ef0e8fb8fc","Type":"ContainerStarted","Data":"3670be2106fcacf7365644a15484335e041c17d977c115eb67d2d48fec30ddfa"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.966847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" event={"ID":"1579bf21-c619-4e37-b0f4-f7b9727daaf5","Type":"ContainerStarted","Data":"02e91910a286ef6d2179ef673c10bb9fa1055fa199457edcd8b57fbf92e92578"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.988078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" event={"ID":"2a21ef08-fa38-470a-a821-3f39a5d72f23","Type":"ContainerStarted","Data":"532b158b6198e736cc94af4102d4737978ba6cf35d912765213fe8fb9df399c5"} Nov 25 12:10:25 crc kubenswrapper[4693]: I1125 12:10:25.994282 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ggsx" event={"ID":"9e5a9da0-728f-4181-9912-fa57f2e2fb3a","Type":"ContainerStarted","Data":"0adeab47f21cd466984d3e7d0082efc6754a9d470e8148196ba9938f48e8aa70"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:25.999972 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" event={"ID":"9f4fda99-711f-4947-aae3-55186580a3cc","Type":"ContainerStarted","Data":"6356591a37e78ec8fc82ed7a062f8b31dba3f0a95204a9f6e2c57ed99c4dc8f3"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.000881 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.003644 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.003690 4693 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.007697 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" event={"ID":"83eaa7e8-5eda-480d-bf44-9bd329c12e8d","Type":"ContainerStarted","Data":"1f0831de2e2fe9253a6560c27d5b00c69cd69bc00302bc92f088263da503e958"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.013846 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" podStartSLOduration=120.013827098 podStartE2EDuration="2m0.013827098s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:25.973955349 +0000 UTC m=+145.892040750" watchObservedRunningTime="2025-11-25 12:10:26.013827098 +0000 UTC m=+145.931912479" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.014573 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-skxbw" podStartSLOduration=121.014569622 podStartE2EDuration="2m1.014569622s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.011883038 +0000 UTC m=+145.929968419" watchObservedRunningTime="2025-11-25 12:10:26.014569622 +0000 UTC m=+145.932655003" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.022167 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" event={"ID":"5c3a5def-8ee1-4719-ab67-916e2cddc6c6","Type":"ContainerStarted","Data":"4dce910b0b4166b7f826887412147729f1a48674e02d2d3f1630a1fc25b34308"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.033994 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" event={"ID":"264f1d17-cf59-4dbf-ad2f-0272713fe3b0","Type":"ContainerStarted","Data":"f36b6965cbea69cbdd0ceaeb074934518976580a2e13aa7367963b5a6df6c520"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.036873 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" event={"ID":"cb5dac5c-eefa-4f03-abd2-8619af829eff","Type":"ContainerStarted","Data":"9b7a2560ad97ce622a13152a6d7331eb7d2a84ab40c403f8f54022d97d93e4d9"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.043785 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" event={"ID":"559e62da-b780-4e38-95ee-379cc6066ffa","Type":"ContainerStarted","Data":"e0d7d9ecf6e79b6ca10b9d280850850b2774551abfa2ac60e7d0086aa7310da5"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.048314 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: 
\"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.050102 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.550087056 +0000 UTC m=+146.468172437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.061061 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" podStartSLOduration=121.061031056 podStartE2EDuration="2m1.061031056s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.05922876 +0000 UTC m=+145.977314161" watchObservedRunningTime="2025-11-25 12:10:26.061031056 +0000 UTC m=+145.979116437" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.074714 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" event={"ID":"0b2a1e7e-a82d-4c81-a2d5-b6d00ccfdd58","Type":"ContainerStarted","Data":"50a2eef4260df4c1c776c5ee8450b82355a9fa9ff2c36f43ab9bbf7042a91812"} Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.100638 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2ggsx" podStartSLOduration=6.100618317 podStartE2EDuration="6.100618317s" podCreationTimestamp="2025-11-25 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.10006886 +0000 UTC m=+146.018154251" watchObservedRunningTime="2025-11-25 12:10:26.100618317 +0000 UTC m=+146.018703698" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.149455 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.150643 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.650618622 +0000 UTC m=+146.568704003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.171399 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2mf49" podStartSLOduration=121.171364077 podStartE2EDuration="2m1.171364077s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.134878713 +0000 UTC m=+146.052964094" watchObservedRunningTime="2025-11-25 12:10:26.171364077 +0000 UTC m=+146.089449468" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.213892 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hnh5z" podStartSLOduration=120.213875869 podStartE2EDuration="2m0.213875869s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.173608236 +0000 UTC m=+146.091693617" watchObservedRunningTime="2025-11-25 12:10:26.213875869 +0000 UTC m=+146.131961250" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.214022 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" podStartSLOduration=121.214018123 podStartE2EDuration="2m1.214018123s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.212661651 +0000 UTC m=+146.130747032" watchObservedRunningTime="2025-11-25 12:10:26.214018123 +0000 UTC m=+146.132103504" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.251656 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.252564 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.752547371 +0000 UTC m=+146.670632753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.267020 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-q7dkt" podStartSLOduration=121.26698928 podStartE2EDuration="2m1.26698928s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.256814374 +0000 UTC m=+146.174899755" watchObservedRunningTime="2025-11-25 12:10:26.26698928 +0000 UTC m=+146.185074661" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.296791 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" podStartSLOduration=120.296767216 podStartE2EDuration="2m0.296767216s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.293229716 +0000 UTC m=+146.211315117" watchObservedRunningTime="2025-11-25 12:10:26.296767216 +0000 UTC m=+146.214852597" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.353203 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.353814 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.853771799 +0000 UTC m=+146.771857180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.354613 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.357727 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.857702021 +0000 UTC m=+146.775787402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.381737 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" podStartSLOduration=120.381709817 podStartE2EDuration="2m0.381709817s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.375405501 +0000 UTC m=+146.293490892" watchObservedRunningTime="2025-11-25 12:10:26.381709817 +0000 UTC m=+146.299795198" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.453162 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v98gn" podStartSLOduration=120.453145858 podStartE2EDuration="2m0.453145858s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.417612003 +0000 UTC m=+146.335697384" watchObservedRunningTime="2025-11-25 12:10:26.453145858 +0000 UTC m=+146.371231229" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.456546 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" podStartSLOduration=120.456529423 podStartE2EDuration="2m0.456529423s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.452475237 +0000 UTC m=+146.370560628" watchObservedRunningTime="2025-11-25 12:10:26.456529423 +0000 UTC m=+146.374614804" Nov 25 
12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.457084 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.457533 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:26.957518144 +0000 UTC m=+146.875603525 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.537996 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4b2tf" podStartSLOduration=121.537974765 podStartE2EDuration="2m1.537974765s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.534215119 +0000 UTC m=+146.452300510" watchObservedRunningTime="2025-11-25 12:10:26.537974765 +0000 UTC m=+146.456060156" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.559000 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.559445 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.059432093 +0000 UTC m=+146.977517474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.573541 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7lrpg" podStartSLOduration=121.573516541 podStartE2EDuration="2m1.573516541s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.572926382 +0000 UTC m=+146.491011763" watchObservedRunningTime="2025-11-25 12:10:26.573516541 +0000 UTC m=+146.491601932" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.615121 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zkt9v" podStartSLOduration=120.615103784 podStartE2EDuration="2m0.615103784s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.61464649 +0000 UTC m=+146.532731871" watchObservedRunningTime="2025-11-25 12:10:26.615103784 +0000 UTC m=+146.533189165" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.660189 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.660293 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.160276299 +0000 UTC m=+147.078361680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.660522 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.661074 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.161065863 +0000 UTC m=+147.079151234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.697878 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-b7p2s" podStartSLOduration=120.697851997 podStartE2EDuration="2m0.697851997s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.655200891 +0000 UTC m=+146.573286282" watchObservedRunningTime="2025-11-25 12:10:26.697851997 +0000 UTC m=+146.615937388" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.701341 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-n884p" podStartSLOduration=6.701317434 podStartE2EDuration="6.701317434s" podCreationTimestamp="2025-11-25 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:26.698686753 +0000 UTC m=+146.616772134" watchObservedRunningTime="2025-11-25 12:10:26.701317434 +0000 UTC m=+146.619402815" Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.761510 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.761642 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.26162308 +0000 UTC m=+147.179708461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.762112 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.762510 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.262502627 +0000 UTC m=+147.180588008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.863432 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.863613 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.36358342 +0000 UTC m=+147.281668801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.863749 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.864252 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.36424426 +0000 UTC m=+147.282329641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.965173 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.965347 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.465314013 +0000 UTC m=+147.383399414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:26 crc kubenswrapper[4693]: I1125 12:10:26.965824 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:26 crc kubenswrapper[4693]: E1125 12:10:26.966202 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.46618848 +0000 UTC m=+147.384274061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.002902 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.002983 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.067686 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.068364 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.568348436 +0000 UTC m=+147.486433817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.081661 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5v4pg" event={"ID":"264f1d17-cf59-4dbf-ad2f-0272713fe3b0","Type":"ContainerStarted","Data":"8e4c280cf180890623ec5a9291e6977e1d4db86e3fc21fc8733e71d297d60f98"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.082910 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" event={"ID":"e5113876-cf94-45b5-9edc-e4ac8af59cb9","Type":"ContainerStarted","Data":"e70cef117018b355017be9ddb870e1b323697d41ef5e59d9560d14e93fe4165c"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.083886 4693 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jf2cd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.083923 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" podUID="e5113876-cf94-45b5-9edc-e4ac8af59cb9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.085541 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" event={"ID":"64d12306-ed10-4a16-8c2b-941bfafaa705","Type":"ContainerStarted","Data":"d7432cbc443d3da2a4f9b9e91b41316da28fe3cb7f0791d32784c12692783d3a"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.085579 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.087238 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" event={"ID":"18eaddfe-4ea4-4581-afe0-b778eb74ff49","Type":"ContainerStarted","Data":"9f6b8678a3ee11bc76d2516cef29d3c107ae8835c419cda87a0469693aee6955"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.087270 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" event={"ID":"18eaddfe-4ea4-4581-afe0-b778eb74ff49","Type":"ContainerStarted","Data":"06b8cb4e78f3cba9d497ca3318e6db29c00092ca21aa933ec02c0b5e64153df0"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.090198 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" event={"ID":"1579bf21-c619-4e37-b0f4-f7b9727daaf5","Type":"ContainerStarted","Data":"bf7629d64b2f3469d7635760614b5f19c9d4093442eef667633bf6c77d20d388"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.093569 4693 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ggsx" event={"ID":"9e5a9da0-728f-4181-9912-fa57f2e2fb3a","Type":"ContainerStarted","Data":"5ca74be7bde2bf8a316c678106525c9280921b96c691256cd463bccaa909f05c"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.096231 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" event={"ID":"bccd7dbe-e658-4ce4-be99-b6642a5bb498","Type":"ContainerStarted","Data":"08c084651b67cb73842fef862eb4b767d01220b9aa4b8d11b9379cc7313aa258"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.098664 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" event={"ID":"e45bad22-7711-44b5-a425-cdc54a795feb","Type":"ContainerStarted","Data":"925a9de5b11958e873255a32011310f821c869e33665d6eac0bcbd2435eff3e0"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.101071 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" event={"ID":"83eaa7e8-5eda-480d-bf44-9bd329c12e8d","Type":"ContainerStarted","Data":"5c82f10f085a5266445f8a65d1c3740031b4ff72ce36a5caaa0f69870f879b85"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.103146 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" event={"ID":"1c77bff0-0391-4a39-890a-8a81b2924f91","Type":"ContainerStarted","Data":"9b9019be26738001b4c3a089ad318d13c3e768f0c396730021256235b1a373ec"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.104685 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" event={"ID":"d1a1d076-b5f5-41c5-89d5-a6975b170f07","Type":"ContainerStarted","Data":"60f4d6838a83053b2af5b5e09b595892c06b423c96c86b768a80fab6dd23ebe1"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.107133 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" event={"ID":"75594753-ed30-49c9-b2ee-b63e64782ab3","Type":"ContainerStarted","Data":"93e0b915412f3a1c441e81340aad2c1e4e177f611f3db1c990119647f9fa1fda"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.109127 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" event={"ID":"5c3a5def-8ee1-4719-ab67-916e2cddc6c6","Type":"ContainerStarted","Data":"b256a9157a844fb1488d6640dfeb219f6ec5876d79903788eaf0a329c22b84bb"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.111577 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jtvmf" event={"ID":"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba","Type":"ContainerStarted","Data":"5206b641be3a5de9d8d9a8a06a684e84915e85b8f610dee645aab03ee2b19d32"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.111604 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jtvmf" event={"ID":"e8ba2980-9fa8-4674-a3c9-b10f0b4df7ba","Type":"ContainerStarted","Data":"2595a16332da72b293148000779484f6294ffaf7504ad19ea18e5e03d4a483b1"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.112009 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.114308 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" event={"ID":"da1a8ea0-27fe-434b-9d60-641c1645b75b","Type":"ContainerStarted","Data":"f78f958855380fcba618cb5cddacfd350a6ce5c26320efe0476696c78d345bc9"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.116510 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" podStartSLOduration=122.116500533 podStartE2EDuration="2m2.116500533s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.11352864 +0000 UTC m=+147.031614021" watchObservedRunningTime="2025-11-25 12:10:27.116500533 +0000 UTC m=+147.034585914" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.117600 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" event={"ID":"51398f58-1dab-4bf2-a7dc-b8669a515200","Type":"ContainerStarted","Data":"b2420a2053ec3edfc37e1f20b2d5a208baa316df7cb403a76a3ee2c20d2726f0"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.118301 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.121052 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" event={"ID":"9c2633b3-7860-4862-85de-77bcc6732a6c","Type":"ContainerStarted","Data":"47f8935128ddbe2f6146091c03da93b24f50bf836c79bc25d1705356ae2b543a"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.121077 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" event={"ID":"9c2633b3-7860-4862-85de-77bcc6732a6c","Type":"ContainerStarted","Data":"25a409bc9874ae3a948ba5fb92d110046ebfb002bbc9584ebd93064fa3a85b34"} Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.121868 4693 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7qd8t container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.121908 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" podUID="667a5672-0a53-472e-a366-1b66dbbc2189" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.122693 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.122719 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:27 crc 
kubenswrapper[4693]: I1125 12:10:27.123993 4693 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q8bsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124023 4693 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mk5xd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124030 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124050 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" podUID="12e72fe6-ed8c-4f53-a8ee-47e19a656342" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124062 4693 patch_prober.go:28] interesting pod/console-operator-58897d9998-xtxw4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124124 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" podUID="549a8467-81c7-4195-9ead-f0cbdba07d61" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124259 4693 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zm7pc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124307 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" podUID="c3bdbfb7-27fc-41d4-a157-36363c246c38" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124508 4693 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-52nbn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.124553 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" 
podUID="413025b6-a706-4ad3-b920-2c9929ddaa0e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.143467 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qwmhz" podStartSLOduration=121.143444821 podStartE2EDuration="2m1.143444821s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.141592674 +0000 UTC m=+147.059678065" watchObservedRunningTime="2025-11-25 12:10:27.143444821 +0000 UTC m=+147.061530202" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.175744 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.178315 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qwhs4" podStartSLOduration=122.178283624 podStartE2EDuration="2m2.178283624s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.17525369 +0000 UTC m=+147.093339071" watchObservedRunningTime="2025-11-25 12:10:27.178283624 +0000 UTC m=+147.096368995" Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.182353 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.68233358 +0000 UTC m=+147.600418961 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.229103 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wj9xr" podStartSLOduration=122.229081953 podStartE2EDuration="2m2.229081953s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.22641329 +0000 UTC m=+147.144498671" watchObservedRunningTime="2025-11-25 12:10:27.229081953 +0000 UTC m=+147.147167334" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.255552 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-t4r4s" podStartSLOduration=121.255531176 podStartE2EDuration="2m1.255531176s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.254358599 +0000 UTC m=+147.172443980" watchObservedRunningTime="2025-11-25 12:10:27.255531176 +0000 UTC m=+147.173616557" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.284136 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.284623 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.78460158 +0000 UTC m=+147.702686961 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.309608 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7snhp" podStartSLOduration=122.309583197 podStartE2EDuration="2m2.309583197s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.283805925 +0000 UTC m=+147.201891306" watchObservedRunningTime="2025-11-25 12:10:27.309583197 +0000 UTC m=+147.227668578" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.331995 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gstv2" podStartSLOduration=121.331973903 podStartE2EDuration="2m1.331973903s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.331211109 +0000 UTC m=+147.249296500" watchObservedRunningTime="2025-11-25 12:10:27.331973903 +0000 UTC m=+147.250059284" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.333741 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jtvmf" podStartSLOduration=7.333735148 podStartE2EDuration="7.333735148s" podCreationTimestamp="2025-11-25 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.310918028 +0000 UTC m=+147.229003409" watchObservedRunningTime="2025-11-25 12:10:27.333735148 +0000 UTC m=+147.251820529" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.388439 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.388765 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.888752258 +0000 UTC m=+147.806837639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.395673 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8jmkl" podStartSLOduration=121.395651853 podStartE2EDuration="2m1.395651853s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.363629777 +0000 UTC m=+147.281715148" watchObservedRunningTime="2025-11-25 12:10:27.395651853 +0000 UTC m=+147.313737234" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.426064 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mkt5h" podStartSLOduration=121.426043818 podStartE2EDuration="2m1.426043818s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.397151219 +0000 UTC m=+147.315236600" watchObservedRunningTime="2025-11-25 12:10:27.426043818 +0000 UTC m=+147.344129199" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.426350 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" podStartSLOduration=121.426345737 podStartE2EDuration="2m1.426345737s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.426275865 +0000 UTC m=+147.344361236" watchObservedRunningTime="2025-11-25 12:10:27.426345737 +0000 UTC m=+147.344431118" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.492163 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" podStartSLOduration=122.492116042 podStartE2EDuration="2m2.492116042s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.467695813 +0000 UTC m=+147.385781194" watchObservedRunningTime="2025-11-25 12:10:27.492116042 +0000 UTC m=+147.410201423" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.492787 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.492888 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mpmjj" podStartSLOduration=121.492884436 podStartE2EDuration="2m1.492884436s" podCreationTimestamp="2025-11-25 12:08:26 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.491778002 +0000 UTC m=+147.409863383" watchObservedRunningTime="2025-11-25 12:10:27.492884436 +0000 UTC m=+147.410969817" Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.493592 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:27.993571438 +0000 UTC m=+147.911656819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.520949 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" podStartSLOduration=121.520929088 podStartE2EDuration="2m1.520929088s" podCreationTimestamp="2025-11-25 12:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.519856155 +0000 UTC m=+147.437941546" watchObservedRunningTime="2025-11-25 12:10:27.520929088 +0000 UTC m=+147.439014469" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.548276 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-p8tpr" podStartSLOduration=122.548259928 podStartE2EDuration="2m2.548259928s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:27.545739849 +0000 UTC m=+147.463825240" watchObservedRunningTime="2025-11-25 12:10:27.548259928 +0000 UTC m=+147.466345309" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.595387 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.595854 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.095837127 +0000 UTC m=+148.013922508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.696931 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.697150 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.197113496 +0000 UTC m=+148.115198877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.697680 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.698100 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.198089467 +0000 UTC m=+148.116175028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.705296 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.705923 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.708025 4693 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-v5rkj container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.708184 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" podUID="da1a8ea0-27fe-434b-9d60-641c1645b75b" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.799110 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.799393 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.299323094 +0000 UTC m=+148.217408475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.799565 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.799958 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 12:10:28.299942824 +0000 UTC m=+148.218028365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:27 crc kubenswrapper[4693]: I1125 12:10:27.900709 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:27 crc kubenswrapper[4693]: E1125 12:10:27.901266 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.401245753 +0000 UTC m=+148.319331134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.002590 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.003215 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.503195394 +0000 UTC m=+148.421280765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.007602 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:28 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:28 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:28 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.007671 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.104495 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.104742 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.604708319 +0000 UTC m=+148.522793700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.104960 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.105346 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.605335859 +0000 UTC m=+148.523421240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.126698 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" event={"ID":"cb5dac5c-eefa-4f03-abd2-8619af829eff","Type":"ContainerStarted","Data":"cec3c022003ecee18572a16452958ed783d3575a672edd104f2c073e2b7a0437"} Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.129625 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" event={"ID":"75594753-ed30-49c9-b2ee-b63e64782ab3","Type":"ContainerStarted","Data":"8f0b765c59ab87f7db085f3e5ce157f1b1370aa0b50d06208988bfcefe3a2b2e"} Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.132301 4693 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q8bsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.132352 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.141860 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mk5xd" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.143336 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.159999 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" podStartSLOduration=123.159972388 podStartE2EDuration="2m3.159972388s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:28.15780046 +0000 UTC m=+148.075885841" watchObservedRunningTime="2025-11-25 12:10:28.159972388 +0000 UTC m=+148.078057769" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.170505 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7qd8t" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.207510 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.210248 4693 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.710223481 +0000 UTC m=+148.628308862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.218506 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.311916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.312339 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.812326986 +0000 UTC m=+148.730412367 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.413534 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.413728 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.913697857 +0000 UTC m=+148.831783238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.414263 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.414674 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:28.914661317 +0000 UTC m=+148.832746698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.516493 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.516748 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.016718981 +0000 UTC m=+148.934804362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.517112 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.517498 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.017490304 +0000 UTC m=+148.935575685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.619128 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.619544 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.119519507 +0000 UTC m=+149.037604888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.721582 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.722064 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.722173 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.722259 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.722341 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.722685 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.222673034 +0000 UTC m=+149.140758405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.722785 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.728390 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.746199 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.746989 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.825978 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.826597 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.326578785 +0000 UTC m=+149.244664166 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.836365 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jf2cd" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.850645 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.867336 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.876690 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.931385 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:28 crc kubenswrapper[4693]: E1125 12:10:28.931807 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.431794726 +0000 UTC m=+149.349880107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.942261 4693 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ptflj container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 25 12:10:28 crc kubenswrapper[4693]: [+]log ok Nov 25 12:10:28 crc kubenswrapper[4693]: [-]poststarthook/max-in-flight-filter failed: reason withheld Nov 25 12:10:28 crc kubenswrapper[4693]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Nov 25 12:10:28 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:28 crc kubenswrapper[4693]: I1125 12:10:28.942533 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" podUID="64d12306-ed10-4a16-8c2b-941bfafaa705" containerName="openshift-config-operator" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.018636 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:29 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:29 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:29 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.018703 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.030243 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ptflj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.035274 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.035758 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.535742788 +0000 UTC m=+149.453828169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.136783 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.137328 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.637308687 +0000 UTC m=+149.555394068 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.239101 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.239490 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.739460233 +0000 UTC m=+149.657545614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.240027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.243263 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.743249551 +0000 UTC m=+149.661334922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.342781 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.343411 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.843395154 +0000 UTC m=+149.761480535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.412670 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dv9rw"] Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.413693 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.436980 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.446058 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.446466 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:29.946450549 +0000 UTC m=+149.864535920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.454721 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dv9rw"] Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.553063 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.553315 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-utilities\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.553357 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4mwx\" (UniqueName: \"kubernetes.io/projected/9fa13974-bf19-4195-af55-6dec3741828d-kube-api-access-x4mwx\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.553432 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-catalog-content\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.553572 4693 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.053552528 +0000 UTC m=+149.971637909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.631137 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gdgjj"] Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.662180 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4mwx\" (UniqueName: \"kubernetes.io/projected/9fa13974-bf19-4195-af55-6dec3741828d-kube-api-access-x4mwx\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.662262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-catalog-content\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.662326 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.662401 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-utilities\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.663034 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-utilities\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.663391 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-catalog-content\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.664139 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-11-25 12:10:30.164123246 +0000 UTC m=+150.082208627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.665569 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.679406 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.737401 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdgjj"] Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.764198 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.764850 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbph\" (UniqueName: \"kubernetes.io/projected/dd350aee-2f89-4b9f-ad62-454d6376a81f-kube-api-access-rzbph\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.764915 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-catalog-content\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.764951 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-utilities\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.765078 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.265062075 +0000 UTC m=+150.183147456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.767906 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4mwx\" (UniqueName: \"kubernetes.io/projected/9fa13974-bf19-4195-af55-6dec3741828d-kube-api-access-x4mwx\") pod \"certified-operators-dv9rw\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.866977 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-catalog-content\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.867039 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-catalog-content\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.867085 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.867114 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-utilities\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.867231 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbph\" (UniqueName: \"kubernetes.io/projected/dd350aee-2f89-4b9f-ad62-454d6376a81f-kube-api-access-rzbph\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.867764 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-utilities\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.868066 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-25 12:10:30.368051547 +0000 UTC m=+150.286136928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.968237 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kwsql"] Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.968929 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:29 crc kubenswrapper[4693]: E1125 12:10:29.969314 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.469297335 +0000 UTC m=+150.387382716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.973162 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:29 crc kubenswrapper[4693]: I1125 12:10:29.990058 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbph\" (UniqueName: \"kubernetes.io/projected/dd350aee-2f89-4b9f-ad62-454d6376a81f-kube-api-access-rzbph\") pod \"community-operators-gdgjj\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.012539 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwsql"] Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.043661 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:30 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:30 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:30 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.043731 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.050719 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.068515 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b7mtn"] Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.069641 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.078896 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.078989 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-catalog-content\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.079013 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdwfm\" (UniqueName: \"kubernetes.io/projected/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-kube-api-access-fdwfm\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.079040 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-utilities\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.079343 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.579331066 +0000 UTC m=+150.497416447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.092706 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.099363 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7mtn"] Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181040 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181275 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-utilities\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181305 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-catalog-content\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181326 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdwfm\" (UniqueName: \"kubernetes.io/projected/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-kube-api-access-fdwfm\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181351 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-catalog-content\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181399 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-utilities\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181458 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826z7\" (UniqueName: \"kubernetes.io/projected/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-kube-api-access-826z7\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.181576 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.681559045 +0000 UTC m=+150.599644416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.181919 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-catalog-content\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.182111 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-utilities\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.189689 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" event={"ID":"cb5dac5c-eefa-4f03-abd2-8619af829eff","Type":"ContainerStarted","Data":"39b8aef163400d8f34560394c6477d9146b091792a4d491193e1670a85b9abf3"} Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.202618 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"703c829514051772430a3f8c7acca5ecdc1f323400110fe1e26fa3073b333b5b"} Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.220244 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdwfm\" (UniqueName: \"kubernetes.io/projected/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-kube-api-access-fdwfm\") pod \"certified-operators-kwsql\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.236616 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9e759ed76ab70c9395eeb0e99dd12e20e52156d3878571bfb2ae0aa6dd1c1e00"} Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.236673 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ad87e9fec6ae9be24f55c6e6f963cb1e3bd0a20c63e10fbf525e6139335be3cf"} Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.237617 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.263624 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8b9f145127536f1f7a6311baa2fc02403beb1b2f1fb35e4f76693e3d60945acc"} Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.284292 
4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-826z7\" (UniqueName: \"kubernetes.io/projected/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-kube-api-access-826z7\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.284397 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-utilities\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.284422 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-catalog-content\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.284458 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.284771 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.784749912 +0000 UTC m=+150.702835293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.285468 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-utilities\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.285765 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-catalog-content\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.319217 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.361132 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-826z7\" (UniqueName: \"kubernetes.io/projected/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-kube-api-access-826z7\") pod \"community-operators-b7mtn\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.390092 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.391313 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.891295275 +0000 UTC m=+150.809380646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.470090 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.492826 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.493163 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:30.993150902 +0000 UTC m=+150.911236283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.530747 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.531498 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.539859 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.539942 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.541898 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.595931 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.596621 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd1119f-152e-4667-bf56-5c22052b29b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.596699 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1119f-152e-4667-bf56-5c22052b29b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.596841 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:31.096820525 +0000 UTC m=+151.014905906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.627699 4693 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.697742 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd1119f-152e-4667-bf56-5c22052b29b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.697791 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.697840 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1119f-152e-4667-bf56-5c22052b29b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.697932 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1119f-152e-4667-bf56-5c22052b29b0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.698812 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:31.198794906 +0000 UTC m=+151.116880287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.736574 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd1119f-152e-4667-bf56-5c22052b29b0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.800245 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.800711 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:31.300693605 +0000 UTC m=+151.218778986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.822140 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdgjj"] Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.856216 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dv9rw"] Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.900409 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.901885 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:30 crc kubenswrapper[4693]: E1125 12:10:30.905037 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:31.405023578 +0000 UTC m=+151.323108959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:30 crc kubenswrapper[4693]: I1125 12:10:30.985518 4693 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-25T12:10:30.627729116Z","Handler":null,"Name":""} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.005536 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:31 crc kubenswrapper[4693]: E1125 12:10:31.005656 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-25 12:10:31.505637207 +0000 UTC m=+151.423722588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.006134 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:31 crc kubenswrapper[4693]: E1125 12:10:31.006612 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-25 12:10:31.506591767 +0000 UTC m=+151.424677148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cg5pd" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.019032 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:31 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:31 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:31 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.019102 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.032323 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwsql"] Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.038934 4693 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.038980 4693 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 25 12:10:31 crc kubenswrapper[4693]: W1125 12:10:31.069190 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89528b4_89fc_4cc4_a2a9_b70683bbbf51.slice/crio-7683cd75c23691e16e0493d80ea048b259381064209e4f5353eea01d65a13036 WatchSource:0}: Error finding container 7683cd75c23691e16e0493d80ea048b259381064209e4f5353eea01d65a13036: Status 404 returned error can't find the container with id 7683cd75c23691e16e0493d80ea048b259381064209e4f5353eea01d65a13036 Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.107691 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.116941 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7mtn"] Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.127060 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 12:10:31 crc kubenswrapper[4693]: W1125 12:10:31.137317 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9dd938a_f1c9_48f8_96ef_470405bfa9e4.slice/crio-d52842082391869602976b3fe7192060523ccd38595c7a0d774ed1a98a6676c5 WatchSource:0}: Error finding container d52842082391869602976b3fe7192060523ccd38595c7a0d774ed1a98a6676c5: Status 404 returned error can't find the container with id d52842082391869602976b3fe7192060523ccd38595c7a0d774ed1a98a6676c5 Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.214683 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.251224 4693 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.251291 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.276882 4693 generic.go:334] "Generic (PLEG): container finished" podID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerID="7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d" exitCode=0 Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.276996 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgjj" event={"ID":"dd350aee-2f89-4b9f-ad62-454d6376a81f","Type":"ContainerDied","Data":"7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.277058 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgjj" event={"ID":"dd350aee-2f89-4b9f-ad62-454d6376a81f","Type":"ContainerStarted","Data":"9fd3e1ef1ee46a89d8dab382de1ca25e1ac73949c31fde625eaa0d34b9348008"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.282657 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.290431 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"be480a9efedc4e98fa6ce404381811a9ae588169408d7e59a87f25220bdfce98"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.330497 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" 
event={"ID":"cb5dac5c-eefa-4f03-abd2-8619af829eff","Type":"ContainerStarted","Data":"b24f7d9a2833cdb9652397e5fa0b2936cb420a27d38c6fbb4a744d6ec87affa7"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.330559 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" event={"ID":"cb5dac5c-eefa-4f03-abd2-8619af829eff","Type":"ContainerStarted","Data":"dfb20ac9f3d9565034d16db1b2b91f8e28a8d4dab773d01fb2eb24a78630725f"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.347439 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cg5pd\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.353558 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwsql" event={"ID":"c89528b4-89fc-4cc4-a2a9-b70683bbbf51","Type":"ContainerStarted","Data":"7683cd75c23691e16e0493d80ea048b259381064209e4f5353eea01d65a13036"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.370746 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.391988 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"11cde010d5c893c7db110ec22ad9decaec08fe725c0abc2a7d2a02cece06db1a"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.419962 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.447152 4693 generic.go:334] "Generic (PLEG): container finished" podID="9fa13974-bf19-4195-af55-6dec3741828d" containerID="b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31" exitCode=0 Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.447696 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9rw" event={"ID":"9fa13974-bf19-4195-af55-6dec3741828d","Type":"ContainerDied","Data":"b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.455174 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9rw" event={"ID":"9fa13974-bf19-4195-af55-6dec3741828d","Type":"ContainerStarted","Data":"2eeb147c5031c81c375681b647309abed069424ee07d6259d7d52179c1776060"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.469610 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7mtn" event={"ID":"a9dd938a-f1c9-48f8-96ef-470405bfa9e4","Type":"ContainerStarted","Data":"d52842082391869602976b3fe7192060523ccd38595c7a0d774ed1a98a6676c5"} Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.523109 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-nckmj" podStartSLOduration=11.523089606 podStartE2EDuration="11.523089606s" podCreationTimestamp="2025-11-25 12:10:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:31.513215509 +0000 UTC m=+151.431300890" watchObservedRunningTime="2025-11-25 12:10:31.523089606 +0000 UTC m=+151.441174987" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.606034 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfz7v"] Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.607410 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.624278 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.641419 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfz7v"] Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.728304 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-utilities\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.728393 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfhh\" (UniqueName: \"kubernetes.io/projected/631e327a-21ca-4eb7-ab17-dd80766e4055-kube-api-access-kjfhh\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.728492 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-catalog-content\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.829451 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfhh\" (UniqueName: \"kubernetes.io/projected/631e327a-21ca-4eb7-ab17-dd80766e4055-kube-api-access-kjfhh\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.829559 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-catalog-content\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.829601 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-utilities\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.830040 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-utilities\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.830584 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-catalog-content\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.854305 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfhh\" (UniqueName: \"kubernetes.io/projected/631e327a-21ca-4eb7-ab17-dd80766e4055-kube-api-access-kjfhh\") pod \"redhat-marketplace-sfz7v\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.982757 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w4j"] Nov 25 12:10:31 crc kubenswrapper[4693]: I1125 12:10:31.984361 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.001425 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w4j"] Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.005676 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:32 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:32 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:32 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.005759 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.031743 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.032984 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-utilities\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.033160 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6675p\" (UniqueName: \"kubernetes.io/projected/6b2c46bf-6877-4fe0-a0af-38c7784f7536-kube-api-access-6675p\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.033580 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-catalog-content\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.138232 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-catalog-content\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.138323 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-utilities\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.138345 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6675p\" (UniqueName: \"kubernetes.io/projected/6b2c46bf-6877-4fe0-a0af-38c7784f7536-kube-api-access-6675p\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.138796 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-utilities\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.139038 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-catalog-content\") pod \"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.164280 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6675p\" (UniqueName: \"kubernetes.io/projected/6b2c46bf-6877-4fe0-a0af-38c7784f7536-kube-api-access-6675p\") pod 
\"redhat-marketplace-p2w4j\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.192349 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg5pd"] Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.302899 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.363796 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.363859 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.370365 4693 patch_prober.go:28] interesting pod/apiserver-76f77b778f-c4h8g container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]log ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]etcd ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/generic-apiserver-start-informers ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/max-in-flight-filter ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 25 12:10:32 crc kubenswrapper[4693]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/project.openshift.io-projectcache ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/openshift.io-startinformers ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 25 12:10:32 crc kubenswrapper[4693]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 25 12:10:32 crc kubenswrapper[4693]: livez check failed Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.370456 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" podUID="75594753-ed30-49c9-b2ee-b63e64782ab3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.384901 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfz7v"] Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.476991 4693 generic.go:334] "Generic (PLEG): container finished" podID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerID="74724cb4c815acfead0d3d39fe2a51ca547c120731a6a33f3fb34476715567fe" exitCode=0 Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.477154 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwsql" 
event={"ID":"c89528b4-89fc-4cc4-a2a9-b70683bbbf51","Type":"ContainerDied","Data":"74724cb4c815acfead0d3d39fe2a51ca547c120731a6a33f3fb34476715567fe"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.479527 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfz7v" event={"ID":"631e327a-21ca-4eb7-ab17-dd80766e4055","Type":"ContainerStarted","Data":"162848c0d9a82c5de924d123684d0905633b39ae041166bf169a4e14b61c8fa7"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.485172 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" event={"ID":"5272f00f-cfb7-49dc-860c-50ec9ee0bd32","Type":"ContainerStarted","Data":"2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.485217 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" event={"ID":"5272f00f-cfb7-49dc-860c-50ec9ee0bd32","Type":"ContainerStarted","Data":"7280994f7fd51e3c049b5cec16a6229398ca11f816baec84603e9cf67b550efd"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.489416 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.491428 4693 patch_prober.go:28] interesting pod/console-f9d7485db-4b2tf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.491463 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4b2tf" podUID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.491686 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.498572 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6bd1119f-152e-4667-bf56-5c22052b29b0","Type":"ContainerStarted","Data":"8803f8124460f856da1406ea8b28e4433357c25add3abb1f7972eedb139e42e0"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.498633 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6bd1119f-152e-4667-bf56-5c22052b29b0","Type":"ContainerStarted","Data":"dec8d7cbb0dfdee60b0871657b6474a5af70b3fbdcb0e4bead33072f69715cb1"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.502407 4693 generic.go:334] "Generic (PLEG): container finished" podID="438f718d-ef67-42c1-b624-7d69c5e6b13f" containerID="fc09e76ed5002e18747cc585fb18fd30f052755db3f531c3ab13577b124cbc10" exitCode=0 Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.502487 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" event={"ID":"438f718d-ef67-42c1-b624-7d69c5e6b13f","Type":"ContainerDied","Data":"fc09e76ed5002e18747cc585fb18fd30f052755db3f531c3ab13577b124cbc10"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.505962 4693 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.511324 4693 generic.go:334] "Generic (PLEG): container finished" podID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerID="635963849eb7738e978866066fc70b497981265b1fe2069a08723aaa5c1f2bd3" exitCode=0 Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.511460 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7mtn" event={"ID":"a9dd938a-f1c9-48f8-96ef-470405bfa9e4","Type":"ContainerDied","Data":"635963849eb7738e978866066fc70b497981265b1fe2069a08723aaa5c1f2bd3"} Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.513701 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.526078 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.526057561 podStartE2EDuration="2.526057561s" podCreationTimestamp="2025-11-25 12:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:32.523175642 +0000 UTC m=+152.441261013" watchObservedRunningTime="2025-11-25 12:10:32.526057561 +0000 UTC m=+152.444142942" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.587320 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z99mj"] Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.588637 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.591288 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.591470 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w4j"] Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.602157 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z99mj"] Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.655471 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-catalog-content\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.655565 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjt58\" (UniqueName: \"kubernetes.io/projected/10bb8880-6561-4ce4-8ca9-847eda1571c1-kube-api-access-mjt58\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.655590 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-utilities\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.712520 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.729534 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-v5rkj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.744525 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.744571 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.744696 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.744742 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.757634 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-catalog-content\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.757734 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjt58\" (UniqueName: \"kubernetes.io/projected/10bb8880-6561-4ce4-8ca9-847eda1571c1-kube-api-access-mjt58\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.757754 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-utilities\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.758211 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-utilities\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.759957 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-catalog-content\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.781344 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjt58\" (UniqueName: \"kubernetes.io/projected/10bb8880-6561-4ce4-8ca9-847eda1571c1-kube-api-access-mjt58\") pod \"redhat-operators-z99mj\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.829874 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.841338 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xtxw4" Nov 25 12:10:32 crc kubenswrapper[4693]: I1125 12:10:32.922382 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:32.992334 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kmwwr"] Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.000298 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kmwwr"] Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.000475 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.001212 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.007497 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:33 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:33 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:33 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.007556 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.063356 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbcg\" (UniqueName: \"kubernetes.io/projected/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-kube-api-access-mxbcg\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.063758 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-utilities\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.063812 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-catalog-content\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.156966 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z99mj"] Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.170648 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-catalog-content\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.171830 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbcg\" (UniqueName: \"kubernetes.io/projected/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-kube-api-access-mxbcg\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.171909 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-utilities\") pod 
\"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.171924 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-catalog-content\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.172992 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-utilities\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.191682 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbcg\" (UniqueName: \"kubernetes.io/projected/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-kube-api-access-mxbcg\") pod \"redhat-operators-kmwwr\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.255909 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.320811 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.537075 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kmwwr"] Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.539168 4693 generic.go:334] "Generic (PLEG): container finished" podID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerID="ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a" exitCode=0 Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.539240 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfz7v" event={"ID":"631e327a-21ca-4eb7-ab17-dd80766e4055","Type":"ContainerDied","Data":"ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a"} Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.551618 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z99mj" event={"ID":"10bb8880-6561-4ce4-8ca9-847eda1571c1","Type":"ContainerDied","Data":"1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013"} Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.551990 4693 generic.go:334] "Generic (PLEG): container finished" podID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerID="1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013" exitCode=0 Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.552068 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z99mj" event={"ID":"10bb8880-6561-4ce4-8ca9-847eda1571c1","Type":"ContainerStarted","Data":"be6f21cc91ad834bfe08ac32ffb9e21b6c44213afb1364ce6f90ce3ae6d0e8cb"} Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.555497 4693 generic.go:334] "Generic (PLEG): container finished" podID="6bd1119f-152e-4667-bf56-5c22052b29b0" containerID="8803f8124460f856da1406ea8b28e4433357c25add3abb1f7972eedb139e42e0" 
exitCode=0 Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.555569 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6bd1119f-152e-4667-bf56-5c22052b29b0","Type":"ContainerDied","Data":"8803f8124460f856da1406ea8b28e4433357c25add3abb1f7972eedb139e42e0"} Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.562915 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerID="ce692a694a282a70ad075fd4754de22191df3d5a47c2a96e315b8f2e23b89218" exitCode=0 Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.563951 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w4j" event={"ID":"6b2c46bf-6877-4fe0-a0af-38c7784f7536","Type":"ContainerDied","Data":"ce692a694a282a70ad075fd4754de22191df3d5a47c2a96e315b8f2e23b89218"} Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.563991 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w4j" event={"ID":"6b2c46bf-6877-4fe0-a0af-38c7784f7536","Type":"ContainerStarted","Data":"2597a3af106ade651d9fed92d7e55921f567399ebce66c73b9c2f1086c583825"} Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.565203 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.598759 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" podStartSLOduration=128.598738034 podStartE2EDuration="2m8.598738034s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:33.595954757 +0000 UTC m=+153.514040148" watchObservedRunningTime="2025-11-25 12:10:33.598738034 +0000 UTC m=+153.516823415" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.861599 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.986754 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume\") pod \"438f718d-ef67-42c1-b624-7d69c5e6b13f\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.986993 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/438f718d-ef67-42c1-b624-7d69c5e6b13f-secret-volume\") pod \"438f718d-ef67-42c1-b624-7d69c5e6b13f\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.987031 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tn76\" (UniqueName: \"kubernetes.io/projected/438f718d-ef67-42c1-b624-7d69c5e6b13f-kube-api-access-2tn76\") pod \"438f718d-ef67-42c1-b624-7d69c5e6b13f\" (UID: \"438f718d-ef67-42c1-b624-7d69c5e6b13f\") " Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.987649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume" (OuterVolumeSpecName: "config-volume") pod "438f718d-ef67-42c1-b624-7d69c5e6b13f" (UID: "438f718d-ef67-42c1-b624-7d69c5e6b13f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:10:33 crc kubenswrapper[4693]: I1125 12:10:33.992917 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438f718d-ef67-42c1-b624-7d69c5e6b13f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "438f718d-ef67-42c1-b624-7d69c5e6b13f" (UID: "438f718d-ef67-42c1-b624-7d69c5e6b13f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.001431 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438f718d-ef67-42c1-b624-7d69c5e6b13f-kube-api-access-2tn76" (OuterVolumeSpecName: "kube-api-access-2tn76") pod "438f718d-ef67-42c1-b624-7d69c5e6b13f" (UID: "438f718d-ef67-42c1-b624-7d69c5e6b13f"). InnerVolumeSpecName "kube-api-access-2tn76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.004401 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:34 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:34 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:34 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.004481 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.088576 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tn76\" (UniqueName: \"kubernetes.io/projected/438f718d-ef67-42c1-b624-7d69c5e6b13f-kube-api-access-2tn76\") on node \"crc\" DevicePath \"\"" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.088612 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/438f718d-ef67-42c1-b624-7d69c5e6b13f-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.088621 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/438f718d-ef67-42c1-b624-7d69c5e6b13f-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.335393 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 12:10:34 crc kubenswrapper[4693]: E1125 12:10:34.335823 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438f718d-ef67-42c1-b624-7d69c5e6b13f" containerName="collect-profiles" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.335848 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="438f718d-ef67-42c1-b624-7d69c5e6b13f" containerName="collect-profiles" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.336026 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="438f718d-ef67-42c1-b624-7d69c5e6b13f" containerName="collect-profiles" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.337217 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.339066 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.341146 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.344095 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.406387 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b80a3f5-cc69-4710-b09a-233617258302-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.406598 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b80a3f5-cc69-4710-b09a-233617258302-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.509429 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b80a3f5-cc69-4710-b09a-233617258302-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.509502 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b80a3f5-cc69-4710-b09a-233617258302-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.510421 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b80a3f5-cc69-4710-b09a-233617258302-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.533652 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b80a3f5-cc69-4710-b09a-233617258302-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.574731 4693 generic.go:334] "Generic (PLEG): container finished" podID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerID="4072b84e29d82be0d202c4a40f089170c80ccaf86d2f1cdc72760003ea75958b" exitCode=0 Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.574804 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmwwr" event={"ID":"a16709b3-1b76-4143-b2e5-c3f84bd9b63b","Type":"ContainerDied","Data":"4072b84e29d82be0d202c4a40f089170c80ccaf86d2f1cdc72760003ea75958b"} Nov 25 
12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.574849 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmwwr" event={"ID":"a16709b3-1b76-4143-b2e5-c3f84bd9b63b","Type":"ContainerStarted","Data":"1b89c0a4c9de46dfc60157f31357cf7de5a0cd1919cc078b1c14dcf0db1a35d0"} Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.590010 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" event={"ID":"438f718d-ef67-42c1-b624-7d69c5e6b13f","Type":"ContainerDied","Data":"2ba66cee4d508a6bceabb73b4520cc02861b78a92c739de64884f39bd484ee95"} Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.590077 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ba66cee4d508a6bceabb73b4520cc02861b78a92c739de64884f39bd484ee95" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.590415 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.664116 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.917824 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:34 crc kubenswrapper[4693]: I1125 12:10:34.954971 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.003403 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:35 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:35 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:35 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.003472 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:35 crc kubenswrapper[4693]: W1125 12:10:35.009801 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b80a3f5_cc69_4710_b09a_233617258302.slice/crio-2a1370fb8659a35fbac858ae611e58e8f6d16457892b28d3519b81d857ba07a2 WatchSource:0}: Error finding container 2a1370fb8659a35fbac858ae611e58e8f6d16457892b28d3519b81d857ba07a2: Status 404 returned error can't find the container with id 2a1370fb8659a35fbac858ae611e58e8f6d16457892b28d3519b81d857ba07a2 Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.020303 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd1119f-152e-4667-bf56-5c22052b29b0-kube-api-access\") pod \"6bd1119f-152e-4667-bf56-5c22052b29b0\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.020414 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6bd1119f-152e-4667-bf56-5c22052b29b0-kubelet-dir\") pod \"6bd1119f-152e-4667-bf56-5c22052b29b0\" (UID: \"6bd1119f-152e-4667-bf56-5c22052b29b0\") " Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.020457 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bd1119f-152e-4667-bf56-5c22052b29b0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6bd1119f-152e-4667-bf56-5c22052b29b0" (UID: "6bd1119f-152e-4667-bf56-5c22052b29b0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.020819 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bd1119f-152e-4667-bf56-5c22052b29b0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.025243 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd1119f-152e-4667-bf56-5c22052b29b0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6bd1119f-152e-4667-bf56-5c22052b29b0" (UID: "6bd1119f-152e-4667-bf56-5c22052b29b0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.113866 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.114191 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.122242 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bd1119f-152e-4667-bf56-5c22052b29b0-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.604334 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b80a3f5-cc69-4710-b09a-233617258302","Type":"ContainerStarted","Data":"2a1370fb8659a35fbac858ae611e58e8f6d16457892b28d3519b81d857ba07a2"} Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.607667 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.607733 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"6bd1119f-152e-4667-bf56-5c22052b29b0","Type":"ContainerDied","Data":"dec8d7cbb0dfdee60b0871657b6474a5af70b3fbdcb0e4bead33072f69715cb1"} Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.607895 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dec8d7cbb0dfdee60b0871657b6474a5af70b3fbdcb0e4bead33072f69715cb1" Nov 25 12:10:35 crc kubenswrapper[4693]: I1125 12:10:35.704842 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jtvmf" Nov 25 12:10:36 crc kubenswrapper[4693]: I1125 12:10:36.003097 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:36 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:36 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:36 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:36 crc kubenswrapper[4693]: I1125 12:10:36.003161 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:36 crc kubenswrapper[4693]: I1125 12:10:36.615931 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b80a3f5-cc69-4710-b09a-233617258302","Type":"ContainerStarted","Data":"cce8c27f1e5d2f1dcf732e7a73853e6e5a091a70e88a6718abd530f5fce6c72f"} Nov 25 12:10:37 crc kubenswrapper[4693]: I1125 12:10:37.003304 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:37 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:37 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:37 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:37 crc kubenswrapper[4693]: I1125 12:10:37.003356 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:37 crc kubenswrapper[4693]: I1125 12:10:37.376800 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:37 crc kubenswrapper[4693]: I1125 12:10:37.380803 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-c4h8g" Nov 25 12:10:37 crc kubenswrapper[4693]: I1125 12:10:37.399481 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.39946187 podStartE2EDuration="3.39946187s" podCreationTimestamp="2025-11-25 12:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:10:36.641783062 +0000 UTC m=+156.559868443" watchObservedRunningTime="2025-11-25 12:10:37.39946187 +0000 UTC m=+157.317547271" Nov 25 12:10:37 crc kubenswrapper[4693]: I1125 12:10:37.626191 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b80a3f5-cc69-4710-b09a-233617258302" containerID="cce8c27f1e5d2f1dcf732e7a73853e6e5a091a70e88a6718abd530f5fce6c72f" exitCode=0 Nov 25 12:10:37 crc kubenswrapper[4693]: I1125 12:10:37.626864 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b80a3f5-cc69-4710-b09a-233617258302","Type":"ContainerDied","Data":"cce8c27f1e5d2f1dcf732e7a73853e6e5a091a70e88a6718abd530f5fce6c72f"} Nov 25 12:10:38 crc kubenswrapper[4693]: I1125 12:10:38.003315 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:38 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:38 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:38 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:38 crc kubenswrapper[4693]: I1125 12:10:38.003406 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:39 crc kubenswrapper[4693]: I1125 12:10:39.003115 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:39 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:39 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:39 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:39 crc kubenswrapper[4693]: I1125 12:10:39.003503 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:40 crc kubenswrapper[4693]: I1125 12:10:40.003218 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:40 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:40 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:40 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:40 crc kubenswrapper[4693]: I1125 12:10:40.003592 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:41 crc kubenswrapper[4693]: I1125 12:10:41.002676 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:41 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:41 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:41 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:41 crc kubenswrapper[4693]: I1125 12:10:41.002723 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.002536 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:42 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:42 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:42 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.002613 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.486634 4693 patch_prober.go:28] interesting pod/console-f9d7485db-4b2tf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.487003 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4b2tf" podUID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.744134 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.744204 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.745541 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:42 crc kubenswrapper[4693]: I1125 12:10:42.745601 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.004220 4693 
patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:43 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:43 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:43 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.004285 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.840230 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.867111 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b80a3f5-cc69-4710-b09a-233617258302-kube-api-access\") pod \"6b80a3f5-cc69-4710-b09a-233617258302\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.867195 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b80a3f5-cc69-4710-b09a-233617258302-kubelet-dir\") pod \"6b80a3f5-cc69-4710-b09a-233617258302\" (UID: \"6b80a3f5-cc69-4710-b09a-233617258302\") " Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.867484 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b80a3f5-cc69-4710-b09a-233617258302-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6b80a3f5-cc69-4710-b09a-233617258302" (UID: "6b80a3f5-cc69-4710-b09a-233617258302"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.871829 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b80a3f5-cc69-4710-b09a-233617258302-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6b80a3f5-cc69-4710-b09a-233617258302" (UID: "6b80a3f5-cc69-4710-b09a-233617258302"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.969257 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b80a3f5-cc69-4710-b09a-233617258302-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 12:10:43 crc kubenswrapper[4693]: I1125 12:10:43.969296 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b80a3f5-cc69-4710-b09a-233617258302-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 12:10:44 crc kubenswrapper[4693]: I1125 12:10:44.004269 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:44 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:44 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:44 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:44 crc kubenswrapper[4693]: I1125 12:10:44.004345 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:44 crc kubenswrapper[4693]: I1125 12:10:44.687831 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6b80a3f5-cc69-4710-b09a-233617258302","Type":"ContainerDied","Data":"2a1370fb8659a35fbac858ae611e58e8f6d16457892b28d3519b81d857ba07a2"} Nov 25 12:10:44 crc kubenswrapper[4693]: I1125 12:10:44.687900 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a1370fb8659a35fbac858ae611e58e8f6d16457892b28d3519b81d857ba07a2" Nov 25 12:10:44 crc kubenswrapper[4693]: I1125 12:10:44.688006 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 25 12:10:45 crc kubenswrapper[4693]: I1125 12:10:45.005014 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:45 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:45 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:45 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:45 crc kubenswrapper[4693]: I1125 12:10:45.005094 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:46 crc kubenswrapper[4693]: I1125 12:10:46.003988 4693 patch_prober.go:28] interesting pod/router-default-5444994796-8r9bv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 25 12:10:46 crc kubenswrapper[4693]: [-]has-synced failed: reason withheld Nov 25 12:10:46 crc kubenswrapper[4693]: [+]process-running ok Nov 25 12:10:46 crc kubenswrapper[4693]: healthz check failed Nov 25 12:10:46 crc kubenswrapper[4693]: I1125 12:10:46.004060 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8r9bv" podUID="b9acbedc-c0ad-4862-b5f0-05adb69d9bde" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 25 12:10:47 crc kubenswrapper[4693]: I1125 12:10:47.005945 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:47 crc kubenswrapper[4693]: I1125 12:10:47.008845 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8r9bv" Nov 25 12:10:48 crc kubenswrapper[4693]: I1125 12:10:48.246742 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:48 crc kubenswrapper[4693]: I1125 12:10:48.252297 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a10eb19c-b500-4cf9-961d-1892ba67560a-metrics-certs\") pod \"network-metrics-daemon-n2f89\" (UID: \"a10eb19c-b500-4cf9-961d-1892ba67560a\") " pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:48 crc kubenswrapper[4693]: I1125 12:10:48.436290 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2f89" Nov 25 12:10:51 crc kubenswrapper[4693]: I1125 12:10:51.380224 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.512938 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.519123 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.745277 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.745334 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.745396 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.745764 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"e0bce37eb650ea3ad7bb4b511c8e38dae1f62677f6b0c7632dcd0b0d5df82f33"} pod="openshift-console/downloads-7954f5f757-skxbw" containerMessage="Container download-server failed liveness probe, will be restarted" Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.745851 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" containerID="cri-o://e0bce37eb650ea3ad7bb4b511c8e38dae1f62677f6b0c7632dcd0b0d5df82f33" gracePeriod=2 Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.746179 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.746477 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.748860 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:10:52 crc kubenswrapper[4693]: I1125 12:10:52.748894 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:10:53 crc kubenswrapper[4693]: I1125 12:10:53.742570 4693 generic.go:334] "Generic (PLEG): container finished" podID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerID="e0bce37eb650ea3ad7bb4b511c8e38dae1f62677f6b0c7632dcd0b0d5df82f33" exitCode=0 Nov 25 12:10:53 crc kubenswrapper[4693]: I1125 12:10:53.742684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-skxbw" event={"ID":"a9e1e257-9a52-475a-a5ec-cd6fa9449f24","Type":"ContainerDied","Data":"e0bce37eb650ea3ad7bb4b511c8e38dae1f62677f6b0c7632dcd0b0d5df82f33"} Nov 25 12:11:01 crc kubenswrapper[4693]: E1125 12:11:01.476035 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 25 12:11:01 crc kubenswrapper[4693]: E1125 12:11:01.476582 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4mwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dv9rw_openshift-marketplace(9fa13974-bf19-4195-af55-6dec3741828d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 12:11:01 crc kubenswrapper[4693]: E1125 12:11:01.477939 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dv9rw" podUID="9fa13974-bf19-4195-af55-6dec3741828d" Nov 25 12:11:02 crc kubenswrapper[4693]: I1125 12:11:02.745633 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:11:02 crc kubenswrapper[4693]: I1125 12:11:02.745912 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:11:02 crc kubenswrapper[4693]: I1125 12:11:02.967255 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-h6fkp" Nov 25 12:11:05 crc kubenswrapper[4693]: I1125 12:11:05.113817 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:11:05 crc kubenswrapper[4693]: I1125 12:11:05.113887 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:11:08 crc kubenswrapper[4693]: E1125 12:11:08.251659 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dv9rw" podUID="9fa13974-bf19-4195-af55-6dec3741828d" Nov 25 12:11:08 crc kubenswrapper[4693]: I1125 12:11:08.860283 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 25 12:11:12 crc kubenswrapper[4693]: I1125 12:11:12.745287 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:11:12 crc kubenswrapper[4693]: I1125 12:11:12.745337 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:11:13 crc kubenswrapper[4693]: E1125 12:11:13.607220 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 25 12:11:13 crc kubenswrapper[4693]: E1125 12:11:13.607713 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kjfhh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sfz7v_openshift-marketplace(631e327a-21ca-4eb7-ab17-dd80766e4055): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 12:11:13 crc kubenswrapper[4693]: E1125 12:11:13.609135 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sfz7v" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" Nov 25 12:11:18 crc kubenswrapper[4693]: E1125 12:11:18.521696 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sfz7v" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" Nov 25 12:11:22 crc kubenswrapper[4693]: I1125 12:11:22.744175 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:11:22 crc kubenswrapper[4693]: I1125 12:11:22.745092 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:11:23 crc kubenswrapper[4693]: E1125 12:11:23.622219 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 25 12:11:23 crc kubenswrapper[4693]: E1125 12:11:23.622471 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
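The &Container{...} blobs in the "Unhandled Error" records are the kubelet printing its corev1.Container type as a Go struct literal. For readability, here is the same init container rebuilt as a Go value with k8s.io/api; the field values are copied from the dumps above (only the image and the kube-api-access-* token mount differ between the catalog pods), and the pod-level wiring is omitted. This is a sketch for reading the dumps, not the operator's own source:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        runAsUser := int64(1000170000)
        runAsNonRoot := true
        allowPrivilegeEscalation := false

        // Reconstruction of the extract-content init container from the
        // &Container{...} dumps above; the projected service-account token
        // mount (kube-api-access-*) is injected by the API server and omitted.
        extractContent := corev1.Container{
            Name:    "extract-content",
            Image:   "registry.redhat.io/redhat/certified-operator-index:v4.18",
            Command: []string{"/utilities/copy-content"},
            Args: []string{
                "--catalog.from=/configs",
                "--catalog.to=/extracted-catalog/catalog",
                "--cache.from=/tmp/cache",
                "--cache.to=/extracted-catalog/cache",
            },
            VolumeMounts: []corev1.VolumeMount{
                {Name: "utilities", MountPath: "/utilities"},
                {Name: "catalog-content", MountPath: "/extracted-catalog"},
            },
            TerminationMessagePath:   "/dev/termination-log",
            TerminationMessagePolicy: corev1.TerminationMessageFallbackToLogsOnError,
            ImagePullPolicy:          corev1.PullAlways,
            SecurityContext: &corev1.SecurityContext{
                Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
                RunAsUser:                &runAsUser,
                RunAsNonRoot:             &runAsNonRoot,
                AllowPrivilegeEscalation: &allowPrivilegeEscalation,
            },
        }
        fmt.Printf("%+v\n", extractContent)
    }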
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdwfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kwsql_openshift-marketplace(c89528b4-89fc-4cc4-a2a9-b70683bbbf51): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 12:11:23 crc kubenswrapper[4693]: E1125 12:11:23.624305 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kwsql" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" Nov 25 12:11:23 crc kubenswrapper[4693]: E1125 12:11:23.652710 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 25 12:11:23 crc kubenswrapper[4693]: E1125 12:11:23.652864 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjt58,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-z99mj_openshift-marketplace(10bb8880-6561-4ce4-8ca9-847eda1571c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 12:11:23 crc kubenswrapper[4693]: E1125 12:11:23.654122 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-z99mj" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" Nov 25 12:11:32 crc kubenswrapper[4693]: I1125 12:11:32.743955 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:11:32 crc kubenswrapper[4693]: I1125 12:11:32.746594 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:11:35 crc kubenswrapper[4693]: I1125 12:11:35.113824 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:11:35 crc kubenswrapper[4693]: I1125 12:11:35.114221 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:11:35 crc kubenswrapper[4693]: I1125 12:11:35.114292 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:11:35 crc kubenswrapper[4693]: I1125 12:11:35.115017 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:11:35 crc kubenswrapper[4693]: I1125 12:11:35.115117 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b" gracePeriod=600 Nov 25 12:11:37 crc kubenswrapper[4693]: E1125 12:11:37.002327 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 25 12:11:37 crc kubenswrapper[4693]: E1125 12:11:37.002490 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mxbcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kmwwr_openshift-marketplace(a16709b3-1b76-4143-b2e5-c3f84bd9b63b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 25 12:11:37 crc kubenswrapper[4693]: E1125 12:11:37.003764 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-kmwwr" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" Nov 25 12:11:40 crc kubenswrapper[4693]: 
Nov 25 12:11:40 crc kubenswrapper[4693]: E1125 12:11:40.184202 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kmwwr" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b"
Nov 25 12:11:40 crc kubenswrapper[4693]: I1125 12:11:40.465494 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n2f89"]
Nov 25 12:11:41 crc kubenswrapper[4693]: I1125 12:11:41.054704 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2f89" event={"ID":"a10eb19c-b500-4cf9-961d-1892ba67560a","Type":"ContainerStarted","Data":"8f0aed5af5024046ac1225edbf85be7de6411a8d4a3f6f20d0c408d5d6e6b0ef"}
Nov 25 12:11:42 crc kubenswrapper[4693]: I1125 12:11:42.064282 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b" exitCode=0
Nov 25 12:11:42 crc kubenswrapper[4693]: I1125 12:11:42.064338 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b"}
Nov 25 12:11:42 crc kubenswrapper[4693]: I1125 12:11:42.745737 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Nov 25 12:11:42 crc kubenswrapper[4693]: I1125 12:11:42.745820 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Nov 25 12:11:45 crc kubenswrapper[4693]: E1125 12:11:45.248160 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Nov 25 12:11:45 crc kubenswrapper[4693]: E1125 12:11:45.248769 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6675p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-p2w4j_openshift-marketplace(6b2c46bf-6877-4fe0-a0af-38c7784f7536): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 12:11:45 crc kubenswrapper[4693]: E1125 12:11:45.250063 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-p2w4j" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536"
Nov 25 12:11:45 crc kubenswrapper[4693]: E1125 12:11:45.294630 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 25 12:11:45 crc kubenswrapper[4693]: E1125 12:11:45.294809 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzbph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gdgjj_openshift-marketplace(dd350aee-2f89-4b9f-ad62-454d6376a81f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 12:11:45 crc kubenswrapper[4693]: E1125 12:11:45.296037 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gdgjj" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f"
Nov 25 12:11:46 crc kubenswrapper[4693]: I1125 12:11:46.088933 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2f89" event={"ID":"a10eb19c-b500-4cf9-961d-1892ba67560a","Type":"ContainerStarted","Data":"a84c999625d719a1adbf447e134808eef84822449671a1a51b305746c12c7196"}
Nov 25 12:11:48 crc kubenswrapper[4693]: E1125 12:11:48.380835 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Nov 25 12:11:48 crc kubenswrapper[4693]: E1125 12:11:48.382174 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-826z7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b7mtn_openshift-marketplace(a9dd938a-f1c9-48f8-96ef-470405bfa9e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Nov 25 12:11:48 crc kubenswrapper[4693]: E1125 12:11:48.383700 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-b7mtn" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4"
Nov 25 12:11:52 crc kubenswrapper[4693]: I1125 12:11:52.743624 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Nov 25 12:11:52 crc kubenswrapper[4693]: I1125 12:11:52.743701 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Nov 25 12:12:02 crc kubenswrapper[4693]: I1125 12:12:02.743541 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Nov 25 12:12:02 crc kubenswrapper[4693]: I1125 12:12:02.744165 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
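Each catalog pod alternates between ErrImagePull ("context canceled") and ImagePullBackOff records. A sketch of the back-off shape behind the "Back-off pulling image" messages, assuming the kubelet's usual image-pull back-off of a 10s initial delay that doubles per failure up to a 5m cap; the log shows only the resulting records, not the parameters:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed back-off parameters; not taken from this log.
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for attempt := 1; attempt <= 7; attempt++ {
            fmt.Printf("pull attempt %d failed; next retry in %s\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }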
event={"ID":"a10eb19c-b500-4cf9-961d-1892ba67560a","Type":"ContainerStarted","Data":"531029c43fbbba482c162570f9f9c1cd39c1098ef64a897fce59494524e4716c"} Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.240585 4693 generic.go:334] "Generic (PLEG): container finished" podID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerID="60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d" exitCode=0 Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.240653 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z99mj" event={"ID":"10bb8880-6561-4ce4-8ca9-847eda1571c1","Type":"ContainerDied","Data":"60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d"} Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.242884 4693 generic.go:334] "Generic (PLEG): container finished" podID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerID="be653373e81654c693c5b12f7a8ad0b81d76e92fbef7da575587922ab86b294c" exitCode=0 Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.242953 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwsql" event={"ID":"c89528b4-89fc-4cc4-a2a9-b70683bbbf51","Type":"ContainerDied","Data":"be653373e81654c693c5b12f7a8ad0b81d76e92fbef7da575587922ab86b294c"} Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.245909 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"b88ee2add6c7828542a3cee62632b97ee1acd6379863900fa881c0767075ca70"} Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.249206 4693 generic.go:334] "Generic (PLEG): container finished" podID="9fa13974-bf19-4195-af55-6dec3741828d" containerID="09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745" exitCode=0 Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.249269 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9rw" event={"ID":"9fa13974-bf19-4195-af55-6dec3741828d","Type":"ContainerDied","Data":"09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745"} Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.263777 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-skxbw" event={"ID":"a9e1e257-9a52-475a-a5ec-cd6fa9449f24","Type":"ContainerStarted","Data":"60b0a9f662e29413bbbdbb0465708bd9b07b692d1b0cd70d6c5c88bd031c80d9"} Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.264153 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.264220 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.264253 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.268206 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerID="8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182" exitCode=0 Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.268259 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfz7v" event={"ID":"631e327a-21ca-4eb7-ab17-dd80766e4055","Type":"ContainerDied","Data":"8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182"} Nov 25 12:12:09 crc kubenswrapper[4693]: I1125 12:12:09.283924 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n2f89" podStartSLOduration=224.28390146 podStartE2EDuration="3m44.28390146s" podCreationTimestamp="2025-11-25 12:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:12:09.256438613 +0000 UTC m=+249.174523994" watchObservedRunningTime="2025-11-25 12:12:09.28390146 +0000 UTC m=+249.201986861" Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.275543 4693 generic.go:334] "Generic (PLEG): container finished" podID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerID="4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c" exitCode=0 Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.275637 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgjj" event={"ID":"dd350aee-2f89-4b9f-ad62-454d6376a81f","Type":"ContainerDied","Data":"4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c"} Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.279541 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwsql" event={"ID":"c89528b4-89fc-4cc4-a2a9-b70683bbbf51","Type":"ContainerStarted","Data":"63656d5830464f8b4a5e093bf282210bd2972646454af673f5482b8a6e20d85d"} Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.281400 4693 generic.go:334] "Generic (PLEG): container finished" podID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerID="894fdfc1e8713e1a1a057d97e6dd3c2ef7decbec96f45e7da12c90a44a9ddb10" exitCode=0 Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.281426 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmwwr" event={"ID":"a16709b3-1b76-4143-b2e5-c3f84bd9b63b","Type":"ContainerDied","Data":"894fdfc1e8713e1a1a057d97e6dd3c2ef7decbec96f45e7da12c90a44a9ddb10"} Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.283926 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfz7v" event={"ID":"631e327a-21ca-4eb7-ab17-dd80766e4055","Type":"ContainerStarted","Data":"b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e"} Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.287656 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerID="964dedae3ef1124ba90ecdefe74acf740b4b59d940dd45fc82e5bbad1d886cce" exitCode=0 Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.287742 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w4j" event={"ID":"6b2c46bf-6877-4fe0-a0af-38c7784f7536","Type":"ContainerDied","Data":"964dedae3ef1124ba90ecdefe74acf740b4b59d940dd45fc82e5bbad1d886cce"} Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.295904 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerID="11e4ed98daa6e00bfacf4f3f4c918b4ca1a016e7eae0e7a406ca7031f732dfc7" exitCode=0 Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.297203 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7mtn" event={"ID":"a9dd938a-f1c9-48f8-96ef-470405bfa9e4","Type":"ContainerDied","Data":"11e4ed98daa6e00bfacf4f3f4c918b4ca1a016e7eae0e7a406ca7031f732dfc7"} Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.298171 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:12:10 crc kubenswrapper[4693]: I1125 12:12:10.298216 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:12:11 crc kubenswrapper[4693]: I1125 12:12:11.303708 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9rw" event={"ID":"9fa13974-bf19-4195-af55-6dec3741828d","Type":"ContainerStarted","Data":"e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4"} Nov 25 12:12:11 crc kubenswrapper[4693]: I1125 12:12:11.305766 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z99mj" event={"ID":"10bb8880-6561-4ce4-8ca9-847eda1571c1","Type":"ContainerStarted","Data":"307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6"} Nov 25 12:12:11 crc kubenswrapper[4693]: I1125 12:12:11.329453 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kwsql" podStartSLOduration=5.028089269 podStartE2EDuration="1m42.329428775s" podCreationTimestamp="2025-11-25 12:10:29 +0000 UTC" firstStartedPulling="2025-11-25 12:10:32.478444821 +0000 UTC m=+152.396530202" lastFinishedPulling="2025-11-25 12:12:09.779784327 +0000 UTC m=+249.697869708" observedRunningTime="2025-11-25 12:12:11.326483152 +0000 UTC m=+251.244568563" watchObservedRunningTime="2025-11-25 12:12:11.329428775 +0000 UTC m=+251.247514196" Nov 25 12:12:11 crc kubenswrapper[4693]: I1125 12:12:11.352623 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfz7v" podStartSLOduration=3.970604569 podStartE2EDuration="1m40.352597783s" podCreationTimestamp="2025-11-25 12:10:31 +0000 UTC" firstStartedPulling="2025-11-25 12:10:33.542251298 +0000 UTC m=+153.460336679" lastFinishedPulling="2025-11-25 12:12:09.924244512 +0000 UTC m=+249.842329893" observedRunningTime="2025-11-25 12:12:11.348851752 +0000 UTC m=+251.266937133" watchObservedRunningTime="2025-11-25 12:12:11.352597783 +0000 UTC m=+251.270683204" Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.032929 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.033271 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.336650 4693 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/certified-operators-dv9rw" podStartSLOduration=4.94985898 podStartE2EDuration="1m43.336625686s" podCreationTimestamp="2025-11-25 12:10:29 +0000 UTC" firstStartedPulling="2025-11-25 12:10:31.457240049 +0000 UTC m=+151.375325430" lastFinishedPulling="2025-11-25 12:12:09.844006755 +0000 UTC m=+249.762092136" observedRunningTime="2025-11-25 12:12:12.335295749 +0000 UTC m=+252.253381140" watchObservedRunningTime="2025-11-25 12:12:12.336625686 +0000 UTC m=+252.254711067" Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.744154 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.744219 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.744159 4693 patch_prober.go:28] interesting pod/downloads-7954f5f757-skxbw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.744342 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-skxbw" podUID="a9e1e257-9a52-475a-a5ec-cd6fa9449f24" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.923628 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:12:12 crc kubenswrapper[4693]: I1125 12:12:12.923712 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:12:13 crc kubenswrapper[4693]: I1125 12:12:13.695869 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-sfz7v" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="registry-server" probeResult="failure" output=< Nov 25 12:12:13 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 12:12:13 crc kubenswrapper[4693]: > Nov 25 12:12:13 crc kubenswrapper[4693]: I1125 12:12:13.984597 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z99mj" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="registry-server" probeResult="failure" output=< Nov 25 12:12:13 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 12:12:13 crc kubenswrapper[4693]: > Nov 25 12:12:16 crc kubenswrapper[4693]: I1125 12:12:16.331918 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgjj" event={"ID":"dd350aee-2f89-4b9f-ad62-454d6376a81f","Type":"ContainerStarted","Data":"af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16"} Nov 25 12:12:17 crc kubenswrapper[4693]: I1125 12:12:17.356014 4693 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z99mj" podStartSLOduration=8.921731969 podStartE2EDuration="1m45.355995585s" podCreationTimestamp="2025-11-25 12:10:32 +0000 UTC" firstStartedPulling="2025-11-25 12:10:33.552871799 +0000 UTC m=+153.470957180" lastFinishedPulling="2025-11-25 12:12:09.987135415 +0000 UTC m=+249.905220796" observedRunningTime="2025-11-25 12:12:12.372859469 +0000 UTC m=+252.290944860" watchObservedRunningTime="2025-11-25 12:12:17.355995585 +0000 UTC m=+257.274080986" Nov 25 12:12:17 crc kubenswrapper[4693]: I1125 12:12:17.358027 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gdgjj" podStartSLOduration=4.563753148 podStartE2EDuration="1m48.358013136s" podCreationTimestamp="2025-11-25 12:10:29 +0000 UTC" firstStartedPulling="2025-11-25 12:10:31.282399342 +0000 UTC m=+151.200484713" lastFinishedPulling="2025-11-25 12:12:15.07665928 +0000 UTC m=+254.994744701" observedRunningTime="2025-11-25 12:12:17.353980155 +0000 UTC m=+257.272065546" watchObservedRunningTime="2025-11-25 12:12:17.358013136 +0000 UTC m=+257.276098527" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.051803 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.052885 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.093827 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.093900 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.148154 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.152964 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.320214 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.320288 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.372456 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.411120 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:12:20 crc kubenswrapper[4693]: I1125 12:12:20.419332 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:12:21 crc kubenswrapper[4693]: I1125 12:12:21.388783 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwsql"] Nov 25 12:12:22 crc kubenswrapper[4693]: I1125 12:12:22.094194 4693 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:12:22 crc kubenswrapper[4693]: I1125 12:12:22.168989 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:12:22 crc kubenswrapper[4693]: I1125 12:12:22.368332 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kwsql" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="registry-server" containerID="cri-o://63656d5830464f8b4a5e093bf282210bd2972646454af673f5482b8a6e20d85d" gracePeriod=2 Nov 25 12:12:22 crc kubenswrapper[4693]: I1125 12:12:22.771033 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-skxbw" Nov 25 12:12:22 crc kubenswrapper[4693]: I1125 12:12:22.985657 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:12:23 crc kubenswrapper[4693]: I1125 12:12:23.031696 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:12:24 crc kubenswrapper[4693]: I1125 12:12:24.383443 4693 generic.go:334] "Generic (PLEG): container finished" podID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerID="63656d5830464f8b4a5e093bf282210bd2972646454af673f5482b8a6e20d85d" exitCode=0 Nov 25 12:12:24 crc kubenswrapper[4693]: I1125 12:12:24.383491 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwsql" event={"ID":"c89528b4-89fc-4cc4-a2a9-b70683bbbf51","Type":"ContainerDied","Data":"63656d5830464f8b4a5e093bf282210bd2972646454af673f5482b8a6e20d85d"} Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.393156 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7mtn" event={"ID":"a9dd938a-f1c9-48f8-96ef-470405bfa9e4","Type":"ContainerStarted","Data":"1ded64d93c5a19985b5316ed3a4c94151e7729fc682f86eed3b678029eb09d67"} Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.729218 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.898628 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdwfm\" (UniqueName: \"kubernetes.io/projected/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-kube-api-access-fdwfm\") pod \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.898719 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-utilities\") pod \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.898791 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-catalog-content\") pod \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\" (UID: \"c89528b4-89fc-4cc4-a2a9-b70683bbbf51\") " Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.900296 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-utilities" (OuterVolumeSpecName: "utilities") pod "c89528b4-89fc-4cc4-a2a9-b70683bbbf51" (UID: "c89528b4-89fc-4cc4-a2a9-b70683bbbf51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.907656 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-kube-api-access-fdwfm" (OuterVolumeSpecName: "kube-api-access-fdwfm") pod "c89528b4-89fc-4cc4-a2a9-b70683bbbf51" (UID: "c89528b4-89fc-4cc4-a2a9-b70683bbbf51"). InnerVolumeSpecName "kube-api-access-fdwfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:12:25 crc kubenswrapper[4693]: I1125 12:12:25.968923 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c89528b4-89fc-4cc4-a2a9-b70683bbbf51" (UID: "c89528b4-89fc-4cc4-a2a9-b70683bbbf51"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.000991 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdwfm\" (UniqueName: \"kubernetes.io/projected/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-kube-api-access-fdwfm\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.001054 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.001077 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c89528b4-89fc-4cc4-a2a9-b70683bbbf51-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.401364 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwsql" event={"ID":"c89528b4-89fc-4cc4-a2a9-b70683bbbf51","Type":"ContainerDied","Data":"7683cd75c23691e16e0493d80ea048b259381064209e4f5353eea01d65a13036"} Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.401391 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwsql" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.401446 4693 scope.go:117] "RemoveContainer" containerID="63656d5830464f8b4a5e093bf282210bd2972646454af673f5482b8a6e20d85d" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.404569 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmwwr" event={"ID":"a16709b3-1b76-4143-b2e5-c3f84bd9b63b","Type":"ContainerStarted","Data":"1b1593b132a1ac4936510ddb465b9e6d820a397eeebf7eaa6c2ef6b5749201b3"} Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.409904 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w4j" event={"ID":"6b2c46bf-6877-4fe0-a0af-38c7784f7536","Type":"ContainerStarted","Data":"bc8373a95f0afe301448a386095f7b06dca925543e0345802c39f61755144da6"} Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.416436 4693 scope.go:117] "RemoveContainer" containerID="be653373e81654c693c5b12f7a8ad0b81d76e92fbef7da575587922ab86b294c" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.438595 4693 scope.go:117] "RemoveContainer" containerID="74724cb4c815acfead0d3d39fe2a51ca547c120731a6a33f3fb34476715567fe" Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.441516 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kwsql"] Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.446335 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kwsql"] Nov 25 12:12:26 crc kubenswrapper[4693]: I1125 12:12:26.819505 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" path="/var/lib/kubelet/pods/c89528b4-89fc-4cc4-a2a9-b70683bbbf51/volumes" Nov 25 12:12:27 crc kubenswrapper[4693]: I1125 12:12:27.467994 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b7mtn" podStartSLOduration=9.542257372 podStartE2EDuration="1m58.467977936s" podCreationTimestamp="2025-11-25 12:10:29 +0000 UTC" firstStartedPulling="2025-11-25 12:10:32.522883053 +0000 UTC m=+152.440968434" 
lastFinishedPulling="2025-11-25 12:12:21.448603607 +0000 UTC m=+261.366688998" observedRunningTime="2025-11-25 12:12:27.443000305 +0000 UTC m=+267.361085686" watchObservedRunningTime="2025-11-25 12:12:27.467977936 +0000 UTC m=+267.386063317" Nov 25 12:12:27 crc kubenswrapper[4693]: I1125 12:12:27.484207 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p2w4j" podStartSLOduration=5.526061601 podStartE2EDuration="1m56.484186943s" podCreationTimestamp="2025-11-25 12:10:31 +0000 UTC" firstStartedPulling="2025-11-25 12:10:33.56645203 +0000 UTC m=+153.484537411" lastFinishedPulling="2025-11-25 12:12:24.524577372 +0000 UTC m=+264.442662753" observedRunningTime="2025-11-25 12:12:27.482994528 +0000 UTC m=+267.401079919" watchObservedRunningTime="2025-11-25 12:12:27.484186943 +0000 UTC m=+267.402272324" Nov 25 12:12:27 crc kubenswrapper[4693]: I1125 12:12:27.486527 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kmwwr" podStartSLOduration=5.5519179659999995 podStartE2EDuration="1m55.48651806s" podCreationTimestamp="2025-11-25 12:10:32 +0000 UTC" firstStartedPulling="2025-11-25 12:10:34.578347943 +0000 UTC m=+154.496433324" lastFinishedPulling="2025-11-25 12:12:24.512948007 +0000 UTC m=+264.431033418" observedRunningTime="2025-11-25 12:12:27.466854223 +0000 UTC m=+267.384939614" watchObservedRunningTime="2025-11-25 12:12:27.48651806 +0000 UTC m=+267.404603441" Nov 25 12:12:30 crc kubenswrapper[4693]: I1125 12:12:30.248026 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:12:30 crc kubenswrapper[4693]: I1125 12:12:30.470985 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:12:30 crc kubenswrapper[4693]: I1125 12:12:30.471051 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:12:30 crc kubenswrapper[4693]: I1125 12:12:30.518411 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:12:31 crc kubenswrapper[4693]: I1125 12:12:31.495636 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:12:32 crc kubenswrapper[4693]: I1125 12:12:32.306034 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:12:32 crc kubenswrapper[4693]: I1125 12:12:32.306092 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:12:32 crc kubenswrapper[4693]: I1125 12:12:32.345116 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:12:32 crc kubenswrapper[4693]: I1125 12:12:32.506722 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:12:33 crc kubenswrapper[4693]: I1125 12:12:33.321584 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:12:33 crc kubenswrapper[4693]: I1125 12:12:33.321943 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:12:33 crc kubenswrapper[4693]: I1125 12:12:33.390586 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:12:33 crc kubenswrapper[4693]: I1125 12:12:33.502665 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:12:33 crc kubenswrapper[4693]: I1125 12:12:33.790151 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7mtn"] Nov 25 12:12:33 crc kubenswrapper[4693]: I1125 12:12:33.790422 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b7mtn" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="registry-server" containerID="cri-o://1ded64d93c5a19985b5316ed3a4c94151e7729fc682f86eed3b678029eb09d67" gracePeriod=2 Nov 25 12:12:35 crc kubenswrapper[4693]: I1125 12:12:35.467870 4693 generic.go:334] "Generic (PLEG): container finished" podID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerID="1ded64d93c5a19985b5316ed3a4c94151e7729fc682f86eed3b678029eb09d67" exitCode=0 Nov 25 12:12:35 crc kubenswrapper[4693]: I1125 12:12:35.467954 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7mtn" event={"ID":"a9dd938a-f1c9-48f8-96ef-470405bfa9e4","Type":"ContainerDied","Data":"1ded64d93c5a19985b5316ed3a4c94151e7729fc682f86eed3b678029eb09d67"} Nov 25 12:12:36 crc kubenswrapper[4693]: I1125 12:12:36.195298 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w4j"] Nov 25 12:12:36 crc kubenswrapper[4693]: I1125 12:12:36.195760 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p2w4j" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="registry-server" containerID="cri-o://bc8373a95f0afe301448a386095f7b06dca925543e0345802c39f61755144da6" gracePeriod=2 Nov 25 12:12:36 crc kubenswrapper[4693]: I1125 12:12:36.398023 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kmwwr"] Nov 25 12:12:36 crc kubenswrapper[4693]: I1125 12:12:36.398473 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kmwwr" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="registry-server" containerID="cri-o://1b1593b132a1ac4936510ddb465b9e6d820a397eeebf7eaa6c2ef6b5749201b3" gracePeriod=2 Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.306045 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.364227 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-catalog-content\") pod \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.364321 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-utilities\") pod \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.364536 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-826z7\" (UniqueName: \"kubernetes.io/projected/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-kube-api-access-826z7\") pod \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\" (UID: \"a9dd938a-f1c9-48f8-96ef-470405bfa9e4\") " Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.367144 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-utilities" (OuterVolumeSpecName: "utilities") pod "a9dd938a-f1c9-48f8-96ef-470405bfa9e4" (UID: "a9dd938a-f1c9-48f8-96ef-470405bfa9e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.371563 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-kube-api-access-826z7" (OuterVolumeSpecName: "kube-api-access-826z7") pod "a9dd938a-f1c9-48f8-96ef-470405bfa9e4" (UID: "a9dd938a-f1c9-48f8-96ef-470405bfa9e4"). InnerVolumeSpecName "kube-api-access-826z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.453934 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9dd938a-f1c9-48f8-96ef-470405bfa9e4" (UID: "a9dd938a-f1c9-48f8-96ef-470405bfa9e4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.465942 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.465998 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.466012 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-826z7\" (UniqueName: \"kubernetes.io/projected/a9dd938a-f1c9-48f8-96ef-470405bfa9e4-kube-api-access-826z7\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.487486 4693 generic.go:334] "Generic (PLEG): container finished" podID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerID="bc8373a95f0afe301448a386095f7b06dca925543e0345802c39f61755144da6" exitCode=0 Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.487592 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w4j" event={"ID":"6b2c46bf-6877-4fe0-a0af-38c7784f7536","Type":"ContainerDied","Data":"bc8373a95f0afe301448a386095f7b06dca925543e0345802c39f61755144da6"} Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.490592 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7mtn" event={"ID":"a9dd938a-f1c9-48f8-96ef-470405bfa9e4","Type":"ContainerDied","Data":"d52842082391869602976b3fe7192060523ccd38595c7a0d774ed1a98a6676c5"} Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.490663 4693 scope.go:117] "RemoveContainer" containerID="1ded64d93c5a19985b5316ed3a4c94151e7729fc682f86eed3b678029eb09d67" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.490695 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b7mtn" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.505696 4693 scope.go:117] "RemoveContainer" containerID="11e4ed98daa6e00bfacf4f3f4c918b4ca1a016e7eae0e7a406ca7031f732dfc7" Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.527270 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7mtn"] Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.530530 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b7mtn"] Nov 25 12:12:37 crc kubenswrapper[4693]: I1125 12:12:37.542446 4693 scope.go:117] "RemoveContainer" containerID="635963849eb7738e978866066fc70b497981265b1fe2069a08723aaa5c1f2bd3" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.496714 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.501457 4693 generic.go:334] "Generic (PLEG): container finished" podID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerID="1b1593b132a1ac4936510ddb465b9e6d820a397eeebf7eaa6c2ef6b5749201b3" exitCode=0 Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.501542 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmwwr" event={"ID":"a16709b3-1b76-4143-b2e5-c3f84bd9b63b","Type":"ContainerDied","Data":"1b1593b132a1ac4936510ddb465b9e6d820a397eeebf7eaa6c2ef6b5749201b3"} Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.504275 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p2w4j" event={"ID":"6b2c46bf-6877-4fe0-a0af-38c7784f7536","Type":"ContainerDied","Data":"2597a3af106ade651d9fed92d7e55921f567399ebce66c73b9c2f1086c583825"} Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.504318 4693 scope.go:117] "RemoveContainer" containerID="bc8373a95f0afe301448a386095f7b06dca925543e0345802c39f61755144da6" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.504435 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p2w4j" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.532741 4693 scope.go:117] "RemoveContainer" containerID="964dedae3ef1124ba90ecdefe74acf740b4b59d940dd45fc82e5bbad1d886cce" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.568177 4693 scope.go:117] "RemoveContainer" containerID="ce692a694a282a70ad075fd4754de22191df3d5a47c2a96e315b8f2e23b89218" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.681758 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-utilities\") pod \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.681850 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-catalog-content\") pod \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.681921 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6675p\" (UniqueName: \"kubernetes.io/projected/6b2c46bf-6877-4fe0-a0af-38c7784f7536-kube-api-access-6675p\") pod \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\" (UID: \"6b2c46bf-6877-4fe0-a0af-38c7784f7536\") " Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.683792 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-utilities" (OuterVolumeSpecName: "utilities") pod "6b2c46bf-6877-4fe0-a0af-38c7784f7536" (UID: "6b2c46bf-6877-4fe0-a0af-38c7784f7536"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.686498 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2c46bf-6877-4fe0-a0af-38c7784f7536-kube-api-access-6675p" (OuterVolumeSpecName: "kube-api-access-6675p") pod "6b2c46bf-6877-4fe0-a0af-38c7784f7536" (UID: "6b2c46bf-6877-4fe0-a0af-38c7784f7536"). InnerVolumeSpecName "kube-api-access-6675p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.724862 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b2c46bf-6877-4fe0-a0af-38c7784f7536" (UID: "6b2c46bf-6877-4fe0-a0af-38c7784f7536"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.783641 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.784153 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2c46bf-6877-4fe0-a0af-38c7784f7536-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.784188 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6675p\" (UniqueName: \"kubernetes.io/projected/6b2c46bf-6877-4fe0-a0af-38c7784f7536-kube-api-access-6675p\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.822717 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" path="/var/lib/kubelet/pods/a9dd938a-f1c9-48f8-96ef-470405bfa9e4/volumes" Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.853545 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w4j"] Nov 25 12:12:39 crc kubenswrapper[4693]: I1125 12:12:38.861074 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p2w4j"] Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.037425 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.202418 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-catalog-content\") pod \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.202517 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-utilities\") pod \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.202544 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxbcg\" (UniqueName: \"kubernetes.io/projected/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-kube-api-access-mxbcg\") pod \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\" (UID: \"a16709b3-1b76-4143-b2e5-c3f84bd9b63b\") " Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.203519 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-utilities" (OuterVolumeSpecName: "utilities") pod "a16709b3-1b76-4143-b2e5-c3f84bd9b63b" (UID: "a16709b3-1b76-4143-b2e5-c3f84bd9b63b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.209561 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-kube-api-access-mxbcg" (OuterVolumeSpecName: "kube-api-access-mxbcg") pod "a16709b3-1b76-4143-b2e5-c3f84bd9b63b" (UID: "a16709b3-1b76-4143-b2e5-c3f84bd9b63b"). InnerVolumeSpecName "kube-api-access-mxbcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.295926 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a16709b3-1b76-4143-b2e5-c3f84bd9b63b" (UID: "a16709b3-1b76-4143-b2e5-c3f84bd9b63b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.304044 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.304078 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxbcg\" (UniqueName: \"kubernetes.io/projected/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-kube-api-access-mxbcg\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.304093 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16709b3-1b76-4143-b2e5-c3f84bd9b63b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.529182 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmwwr" event={"ID":"a16709b3-1b76-4143-b2e5-c3f84bd9b63b","Type":"ContainerDied","Data":"1b89c0a4c9de46dfc60157f31357cf7de5a0cd1919cc078b1c14dcf0db1a35d0"} Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.529275 4693 scope.go:117] "RemoveContainer" containerID="1b1593b132a1ac4936510ddb465b9e6d820a397eeebf7eaa6c2ef6b5749201b3" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.529296 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmwwr" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.557039 4693 scope.go:117] "RemoveContainer" containerID="894fdfc1e8713e1a1a057d97e6dd3c2ef7decbec96f45e7da12c90a44a9ddb10" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.575588 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kmwwr"] Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.577532 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kmwwr"] Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.611542 4693 scope.go:117] "RemoveContainer" containerID="4072b84e29d82be0d202c4a40f089170c80ccaf86d2f1cdc72760003ea75958b" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.825618 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" path="/var/lib/kubelet/pods/6b2c46bf-6877-4fe0-a0af-38c7784f7536/volumes" Nov 25 12:12:40 crc kubenswrapper[4693]: I1125 12:12:40.826922 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" path="/var/lib/kubelet/pods/a16709b3-1b76-4143-b2e5-c3f84bd9b63b/volumes" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.891586 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dv9rw"] Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.892473 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dv9rw" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="registry-server" containerID="cri-o://e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4" gracePeriod=30 Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.911486 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gdgjj"] Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.911823 4693 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gdgjj" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="registry-server" containerID="cri-o://af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16" gracePeriod=30 Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.919728 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q8bsw"] Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.919959 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerName="marketplace-operator" containerID="cri-o://ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25" gracePeriod=30 Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.926278 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfz7v"] Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.930272 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfz7v" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="registry-server" containerID="cri-o://b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e" gracePeriod=30 Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.932143 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z99mj"] Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.932432 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z99mj" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="registry-server" containerID="cri-o://307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6" gracePeriod=30 Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960083 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-87z2l"] Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960319 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960336 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960345 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b80a3f5-cc69-4710-b09a-233617258302" containerName="pruner" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960352 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b80a3f5-cc69-4710-b09a-233617258302" containerName="pruner" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960362 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960387 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960400 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="extract-content" Nov 25 12:13:04 crc 
kubenswrapper[4693]: I1125 12:13:04.960407 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="extract-content" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960421 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960428 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960440 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960447 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960457 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960464 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960474 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960482 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="extract-utilities" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960491 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="extract-content" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960497 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="extract-content" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960507 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="extract-content" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960514 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="extract-content" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960524 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd1119f-152e-4667-bf56-5c22052b29b0" containerName="pruner" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960531 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd1119f-152e-4667-bf56-5c22052b29b0" containerName="pruner" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960538 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960545 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960555 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="registry-server" Nov 25 
12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960562 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: E1125 12:13:04.960571 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="extract-content" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960578 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="extract-content" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960674 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16709b3-1b76-4143-b2e5-c3f84bd9b63b" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960684 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b80a3f5-cc69-4710-b09a-233617258302" containerName="pruner" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960694 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9dd938a-f1c9-48f8-96ef-470405bfa9e4" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960705 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd1119f-152e-4667-bf56-5c22052b29b0" containerName="pruner" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960715 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c89528b4-89fc-4cc4-a2a9-b70683bbbf51" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.960724 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2c46bf-6877-4fe0-a0af-38c7784f7536" containerName="registry-server" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.961128 4693 util.go:30] "No sandbox for pod can be found. 
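[Editor's note] The burst of cpu_manager "RemoveStaleState: removing container" / "Deleted CPUSet assignment" pairs (and the matching memory_manager lines) fires when the replacement pod is admitted: the resource managers prune per-container state left behind by pods that no longer exist. The E-level lines are just how that removal is logged, not failures. A minimal sketch of stale-state pruning over a two-level map; the types are ours, not kubelet's:

```go
package main

import "fmt"

// assignments mimics cpu_manager state: podUID -> containerName -> cpuset.
type assignments map[string]map[string]string

// removeStaleState drops entries for pods the kubelet no longer tracks,
// echoing the "RemoveStaleState: removing container" /
// "Deleted CPUSet assignment" pairs in the log above.
func removeStaleState(st assignments, active map[string]bool) {
	for podUID, containers := range st {
		if active[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container pod=%s container=%s\n", podUID, name)
		}
		delete(st, podUID)
	}
}

func main() {
	// UIDs from the log: the deleted community-operators pod goes stale,
	// the new marketplace-operator pod stays. CPUSet strings are made up.
	st := assignments{
		"a9dd938a-f1c9-48f8-96ef-470405bfa9e4": {"extract-utilities": "0-1", "registry-server": "2"},
		"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc": {"marketplace-operator": "3"},
	}
	removeStaleState(st, map[string]bool{"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc": true})
}
```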
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:04 crc kubenswrapper[4693]: I1125 12:13:04.985748 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-87z2l"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.105426 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.105768 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.105799 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9xc4\" (UniqueName: \"kubernetes.io/projected/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-kube-api-access-q9xc4\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.207440 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.207509 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.207552 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9xc4\" (UniqueName: \"kubernetes.io/projected/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-kube-api-access-q9xc4\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.212092 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.215826 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.225026 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9xc4\" (UniqueName: \"kubernetes.io/projected/c97ed1b2-4d1e-45f8-9aa7-67336324d2cc-kube-api-access-q9xc4\") pod \"marketplace-operator-79b997595-87z2l\" (UID: \"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc\") " pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.319147 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.329027 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.335513 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.336461 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.337640 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.343108 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511612 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-catalog-content\") pod \"631e327a-21ca-4eb7-ab17-dd80766e4055\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511649 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-utilities\") pod \"dd350aee-2f89-4b9f-ad62-454d6376a81f\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511681 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-catalog-content\") pod \"10bb8880-6561-4ce4-8ca9-847eda1571c1\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511706 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjt58\" (UniqueName: \"kubernetes.io/projected/10bb8880-6561-4ce4-8ca9-847eda1571c1-kube-api-access-mjt58\") pod \"10bb8880-6561-4ce4-8ca9-847eda1571c1\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511738 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-utilities\") pod \"10bb8880-6561-4ce4-8ca9-847eda1571c1\" (UID: \"10bb8880-6561-4ce4-8ca9-847eda1571c1\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511760 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfhh\" (UniqueName: \"kubernetes.io/projected/631e327a-21ca-4eb7-ab17-dd80766e4055-kube-api-access-kjfhh\") pod \"631e327a-21ca-4eb7-ab17-dd80766e4055\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511788 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-utilities\") pod \"9fa13974-bf19-4195-af55-6dec3741828d\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511809 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-catalog-content\") pod \"9fa13974-bf19-4195-af55-6dec3741828d\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511851 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4mwx\" (UniqueName: \"kubernetes.io/projected/9fa13974-bf19-4195-af55-6dec3741828d-kube-api-access-x4mwx\") pod \"9fa13974-bf19-4195-af55-6dec3741828d\" (UID: \"9fa13974-bf19-4195-af55-6dec3741828d\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511868 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzbph\" (UniqueName: 
\"kubernetes.io/projected/dd350aee-2f89-4b9f-ad62-454d6376a81f-kube-api-access-rzbph\") pod \"dd350aee-2f89-4b9f-ad62-454d6376a81f\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511896 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-utilities\") pod \"631e327a-21ca-4eb7-ab17-dd80766e4055\" (UID: \"631e327a-21ca-4eb7-ab17-dd80766e4055\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511920 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m42r\" (UniqueName: \"kubernetes.io/projected/b473ce6c-f37a-472a-a1f2-89332034cdee-kube-api-access-4m42r\") pod \"b473ce6c-f37a-472a-a1f2-89332034cdee\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511939 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-trusted-ca\") pod \"b473ce6c-f37a-472a-a1f2-89332034cdee\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511964 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-catalog-content\") pod \"dd350aee-2f89-4b9f-ad62-454d6376a81f\" (UID: \"dd350aee-2f89-4b9f-ad62-454d6376a81f\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.511996 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-operator-metrics\") pod \"b473ce6c-f37a-472a-a1f2-89332034cdee\" (UID: \"b473ce6c-f37a-472a-a1f2-89332034cdee\") " Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.512548 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-utilities" (OuterVolumeSpecName: "utilities") pod "10bb8880-6561-4ce4-8ca9-847eda1571c1" (UID: "10bb8880-6561-4ce4-8ca9-847eda1571c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.513527 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-utilities" (OuterVolumeSpecName: "utilities") pod "dd350aee-2f89-4b9f-ad62-454d6376a81f" (UID: "dd350aee-2f89-4b9f-ad62-454d6376a81f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.514553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-utilities" (OuterVolumeSpecName: "utilities") pod "9fa13974-bf19-4195-af55-6dec3741828d" (UID: "9fa13974-bf19-4195-af55-6dec3741828d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.514776 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b473ce6c-f37a-472a-a1f2-89332034cdee" (UID: "b473ce6c-f37a-472a-a1f2-89332034cdee"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.515872 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-utilities" (OuterVolumeSpecName: "utilities") pod "631e327a-21ca-4eb7-ab17-dd80766e4055" (UID: "631e327a-21ca-4eb7-ab17-dd80766e4055"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.516644 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10bb8880-6561-4ce4-8ca9-847eda1571c1-kube-api-access-mjt58" (OuterVolumeSpecName: "kube-api-access-mjt58") pod "10bb8880-6561-4ce4-8ca9-847eda1571c1" (UID: "10bb8880-6561-4ce4-8ca9-847eda1571c1"). InnerVolumeSpecName "kube-api-access-mjt58". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.516657 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa13974-bf19-4195-af55-6dec3741828d-kube-api-access-x4mwx" (OuterVolumeSpecName: "kube-api-access-x4mwx") pod "9fa13974-bf19-4195-af55-6dec3741828d" (UID: "9fa13974-bf19-4195-af55-6dec3741828d"). InnerVolumeSpecName "kube-api-access-x4mwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.516682 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd350aee-2f89-4b9f-ad62-454d6376a81f-kube-api-access-rzbph" (OuterVolumeSpecName: "kube-api-access-rzbph") pod "dd350aee-2f89-4b9f-ad62-454d6376a81f" (UID: "dd350aee-2f89-4b9f-ad62-454d6376a81f"). InnerVolumeSpecName "kube-api-access-rzbph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.517721 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631e327a-21ca-4eb7-ab17-dd80766e4055-kube-api-access-kjfhh" (OuterVolumeSpecName: "kube-api-access-kjfhh") pod "631e327a-21ca-4eb7-ab17-dd80766e4055" (UID: "631e327a-21ca-4eb7-ab17-dd80766e4055"). InnerVolumeSpecName "kube-api-access-kjfhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.519885 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b473ce6c-f37a-472a-a1f2-89332034cdee-kube-api-access-4m42r" (OuterVolumeSpecName: "kube-api-access-4m42r") pod "b473ce6c-f37a-472a-a1f2-89332034cdee" (UID: "b473ce6c-f37a-472a-a1f2-89332034cdee"). InnerVolumeSpecName "kube-api-access-4m42r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.530779 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b473ce6c-f37a-472a-a1f2-89332034cdee" (UID: "b473ce6c-f37a-472a-a1f2-89332034cdee"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.538108 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-87z2l"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.541931 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "631e327a-21ca-4eb7-ab17-dd80766e4055" (UID: "631e327a-21ca-4eb7-ab17-dd80766e4055"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.586044 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fa13974-bf19-4195-af55-6dec3741828d" (UID: "9fa13974-bf19-4195-af55-6dec3741828d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.604073 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd350aee-2f89-4b9f-ad62-454d6376a81f" (UID: "dd350aee-2f89-4b9f-ad62-454d6376a81f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613119 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613145 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613156 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613165 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjt58\" (UniqueName: \"kubernetes.io/projected/10bb8880-6561-4ce4-8ca9-847eda1571c1-kube-api-access-mjt58\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613173 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613182 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjfhh\" (UniqueName: \"kubernetes.io/projected/631e327a-21ca-4eb7-ab17-dd80766e4055-kube-api-access-kjfhh\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613209 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613217 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa13974-bf19-4195-af55-6dec3741828d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613225 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4mwx\" (UniqueName: \"kubernetes.io/projected/9fa13974-bf19-4195-af55-6dec3741828d-kube-api-access-x4mwx\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613234 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzbph\" (UniqueName: \"kubernetes.io/projected/dd350aee-2f89-4b9f-ad62-454d6376a81f-kube-api-access-rzbph\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613242 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/631e327a-21ca-4eb7-ab17-dd80766e4055-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613249 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m42r\" (UniqueName: \"kubernetes.io/projected/b473ce6c-f37a-472a-a1f2-89332034cdee-kube-api-access-4m42r\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613258 4693 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b473ce6c-f37a-472a-a1f2-89332034cdee-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.613266 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd350aee-2f89-4b9f-ad62-454d6376a81f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.641745 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10bb8880-6561-4ce4-8ca9-847eda1571c1" (UID: "10bb8880-6561-4ce4-8ca9-847eda1571c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.710752 4693 generic.go:334] "Generic (PLEG): container finished" podID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerID="b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e" exitCode=0 Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.710821 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfz7v" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.710811 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfz7v" event={"ID":"631e327a-21ca-4eb7-ab17-dd80766e4055","Type":"ContainerDied","Data":"b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.710944 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfz7v" event={"ID":"631e327a-21ca-4eb7-ab17-dd80766e4055","Type":"ContainerDied","Data":"162848c0d9a82c5de924d123684d0905633b39ae041166bf169a4e14b61c8fa7"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.710968 4693 scope.go:117] "RemoveContainer" containerID="b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.712986 4693 generic.go:334] "Generic (PLEG): container finished" podID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerID="307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6" exitCode=0 Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.713088 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z99mj" event={"ID":"10bb8880-6561-4ce4-8ca9-847eda1571c1","Type":"ContainerDied","Data":"307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.713148 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z99mj" event={"ID":"10bb8880-6561-4ce4-8ca9-847eda1571c1","Type":"ContainerDied","Data":"be6f21cc91ad834bfe08ac32ffb9e21b6c44213afb1364ce6f90ce3ae6d0e8cb"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.713207 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z99mj" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.715266 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bb8880-6561-4ce4-8ca9-847eda1571c1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.716966 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" event={"ID":"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc","Type":"ContainerStarted","Data":"d6826861f7852590a5444390cba30b3a8f01d4c3f7cb0bdd3722b15b37983840"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.717014 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" event={"ID":"c97ed1b2-4d1e-45f8-9aa7-67336324d2cc","Type":"ContainerStarted","Data":"c32f075ea12a83b1727bbb496b054105f783e3b7f065d83ea358430c4b8530e7"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.717567 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.718763 4693 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-87z2l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.54:8080/healthz\": dial tcp 10.217.0.54:8080: connect: connection refused" start-of-body= Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.718806 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" podUID="c97ed1b2-4d1e-45f8-9aa7-67336324d2cc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.54:8080/healthz\": dial tcp 10.217.0.54:8080: connect: connection refused" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.721460 4693 generic.go:334] "Generic (PLEG): container finished" podID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerID="af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16" exitCode=0 Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.721508 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgjj" event={"ID":"dd350aee-2f89-4b9f-ad62-454d6376a81f","Type":"ContainerDied","Data":"af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.721531 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdgjj" event={"ID":"dd350aee-2f89-4b9f-ad62-454d6376a81f","Type":"ContainerDied","Data":"9fd3e1ef1ee46a89d8dab382de1ca25e1ac73949c31fde625eaa0d34b9348008"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.721595 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdgjj" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.725789 4693 generic.go:334] "Generic (PLEG): container finished" podID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerID="ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25" exitCode=0 Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.725950 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.726189 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" event={"ID":"b473ce6c-f37a-472a-a1f2-89332034cdee","Type":"ContainerDied","Data":"ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.726218 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q8bsw" event={"ID":"b473ce6c-f37a-472a-a1f2-89332034cdee","Type":"ContainerDied","Data":"47d140250ec5a46a9e6329df747ce24cc3d94f3e25b2910fdcb1b818fd72ae4c"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.730103 4693 generic.go:334] "Generic (PLEG): container finished" podID="9fa13974-bf19-4195-af55-6dec3741828d" containerID="e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4" exitCode=0 Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.730125 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9rw" event={"ID":"9fa13974-bf19-4195-af55-6dec3741828d","Type":"ContainerDied","Data":"e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.730151 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9rw" event={"ID":"9fa13974-bf19-4195-af55-6dec3741828d","Type":"ContainerDied","Data":"2eeb147c5031c81c375681b647309abed069424ee07d6259d7d52179c1776060"} Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.730223 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dv9rw" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.732741 4693 scope.go:117] "RemoveContainer" containerID="8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.750144 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" podStartSLOduration=1.750120734 podStartE2EDuration="1.750120734s" podCreationTimestamp="2025-11-25 12:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:13:05.747244821 +0000 UTC m=+305.665330222" watchObservedRunningTime="2025-11-25 12:13:05.750120734 +0000 UTC m=+305.668206115" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.764532 4693 scope.go:117] "RemoveContainer" containerID="ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.771492 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfz7v"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.772941 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfz7v"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.788407 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z99mj"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.796759 4693 scope.go:117] "RemoveContainer" containerID="b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.797736 
4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e\": container with ID starting with b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e not found: ID does not exist" containerID="b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.797779 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e"} err="failed to get container status \"b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e\": rpc error: code = NotFound desc = could not find container \"b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e\": container with ID starting with b29dc5f9743915df6fbe3cf8b52439bae487d000fcd83d0a64f43eab51f5779e not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.797809 4693 scope.go:117] "RemoveContainer" containerID="8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.798131 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z99mj"] Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.798163 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182\": container with ID starting with 8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182 not found: ID does not exist" containerID="8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.798191 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182"} err="failed to get container status \"8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182\": rpc error: code = NotFound desc = could not find container \"8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182\": container with ID starting with 8e49a9b1ff37d4ed58a08eea31d7b131b185c71a1c2ed333c9996986f6f46182 not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.798216 4693 scope.go:117] "RemoveContainer" containerID="ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.798749 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a\": container with ID starting with ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a not found: ID does not exist" containerID="ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.798777 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a"} err="failed to get container status \"ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a\": rpc error: code = NotFound desc = could not find container \"ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a\": container with ID starting with 
ebd0b0f6b4b71f1292dca1b6ed07e54f700c6d9327e5fbb8141ec07fd356fb0a not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.798794 4693 scope.go:117] "RemoveContainer" containerID="307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.807904 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dv9rw"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.816915 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dv9rw"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.821553 4693 scope.go:117] "RemoveContainer" containerID="60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.822860 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q8bsw"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.827489 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q8bsw"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.834451 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gdgjj"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.839903 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gdgjj"] Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.844075 4693 scope.go:117] "RemoveContainer" containerID="1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.856966 4693 scope.go:117] "RemoveContainer" containerID="307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.857300 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6\": container with ID starting with 307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6 not found: ID does not exist" containerID="307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.857359 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6"} err="failed to get container status \"307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6\": rpc error: code = NotFound desc = could not find container \"307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6\": container with ID starting with 307cd03e1a8cad826f897b8bc43a8b69645c1c2e7ef191b5e786b6e6c31a53c6 not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.857424 4693 scope.go:117] "RemoveContainer" containerID="60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.857986 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d\": container with ID starting with 60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d not found: ID does not exist" 
containerID="60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.858035 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d"} err="failed to get container status \"60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d\": rpc error: code = NotFound desc = could not find container \"60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d\": container with ID starting with 60b2eca352e1dc15ab7927e14d84d379887e587de13e2404c5b2b89614d8615d not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.858078 4693 scope.go:117] "RemoveContainer" containerID="1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.858394 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013\": container with ID starting with 1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013 not found: ID does not exist" containerID="1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.858425 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013"} err="failed to get container status \"1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013\": rpc error: code = NotFound desc = could not find container \"1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013\": container with ID starting with 1f9395f498c6ab7e4bcb26e97e2749f8a7d4a769d45b9f506cb46f5a504a9013 not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.858444 4693 scope.go:117] "RemoveContainer" containerID="af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.875204 4693 scope.go:117] "RemoveContainer" containerID="4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.889303 4693 scope.go:117] "RemoveContainer" containerID="7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.904340 4693 scope.go:117] "RemoveContainer" containerID="af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.904750 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16\": container with ID starting with af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16 not found: ID does not exist" containerID="af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.904775 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16"} err="failed to get container status \"af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16\": rpc error: code = NotFound desc = could not find container 
\"af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16\": container with ID starting with af31617de43dd6ccbdabf7e16bde0b353b4d28ad6b53ccf9873f8a199aeb6f16 not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.904795 4693 scope.go:117] "RemoveContainer" containerID="4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.905348 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c\": container with ID starting with 4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c not found: ID does not exist" containerID="4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.905388 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c"} err="failed to get container status \"4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c\": rpc error: code = NotFound desc = could not find container \"4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c\": container with ID starting with 4c1e53ff95c329ce4c69d96603d92e247d4c48ec287086cdc6cdc6517730271c not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.905401 4693 scope.go:117] "RemoveContainer" containerID="7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.905949 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d\": container with ID starting with 7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d not found: ID does not exist" containerID="7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.905969 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d"} err="failed to get container status \"7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d\": rpc error: code = NotFound desc = could not find container \"7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d\": container with ID starting with 7e2bff4f1f091ecb1774f0fe91be1b5ed4f4980a20e6ba77a2c4d45bd74b205d not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.905982 4693 scope.go:117] "RemoveContainer" containerID="ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.936318 4693 scope.go:117] "RemoveContainer" containerID="ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.937094 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25\": container with ID starting with ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25 not found: ID does not exist" containerID="ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 
12:13:05.937136 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25"} err="failed to get container status \"ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25\": rpc error: code = NotFound desc = could not find container \"ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25\": container with ID starting with ab7cfc88f7dc993071d51ac24f720f7d150f34d9f09d874a47a5ee2e39a0cd25 not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.937165 4693 scope.go:117] "RemoveContainer" containerID="e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.958603 4693 scope.go:117] "RemoveContainer" containerID="09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.975351 4693 scope.go:117] "RemoveContainer" containerID="b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.997059 4693 scope.go:117] "RemoveContainer" containerID="e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.997446 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4\": container with ID starting with e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4 not found: ID does not exist" containerID="e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.997496 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4"} err="failed to get container status \"e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4\": rpc error: code = NotFound desc = could not find container \"e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4\": container with ID starting with e586989a954e7e785376b8c001af049444bf5288b463e9e71344dfeaaf3162a4 not found: ID does not exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.997530 4693 scope.go:117] "RemoveContainer" containerID="09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.997821 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745\": container with ID starting with 09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745 not found: ID does not exist" containerID="09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.997863 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745"} err="failed to get container status \"09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745\": rpc error: code = NotFound desc = could not find container \"09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745\": container with ID starting with 09acacf774eb40f876c81514f2a2c73b7f0bbf54135a795af95c079b5296f745 not found: ID does not 
exist" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.997889 4693 scope.go:117] "RemoveContainer" containerID="b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31" Nov 25 12:13:05 crc kubenswrapper[4693]: E1125 12:13:05.998209 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31\": container with ID starting with b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31 not found: ID does not exist" containerID="b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31" Nov 25 12:13:05 crc kubenswrapper[4693]: I1125 12:13:05.998279 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31"} err="failed to get container status \"b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31\": rpc error: code = NotFound desc = could not find container \"b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31\": container with ID starting with b369e877c242671d5e8b990f5efa698a19177c45f532992f4f8330f604893d31 not found: ID does not exist" Nov 25 12:13:06 crc kubenswrapper[4693]: I1125 12:13:06.023039 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zm7pc"] Nov 25 12:13:06 crc kubenswrapper[4693]: I1125 12:13:06.745387 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-87z2l" Nov 25 12:13:06 crc kubenswrapper[4693]: I1125 12:13:06.820801 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" path="/var/lib/kubelet/pods/10bb8880-6561-4ce4-8ca9-847eda1571c1/volumes" Nov 25 12:13:06 crc kubenswrapper[4693]: I1125 12:13:06.822095 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" path="/var/lib/kubelet/pods/631e327a-21ca-4eb7-ab17-dd80766e4055/volumes" Nov 25 12:13:06 crc kubenswrapper[4693]: I1125 12:13:06.823167 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa13974-bf19-4195-af55-6dec3741828d" path="/var/lib/kubelet/pods/9fa13974-bf19-4195-af55-6dec3741828d/volumes" Nov 25 12:13:06 crc kubenswrapper[4693]: I1125 12:13:06.824960 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" path="/var/lib/kubelet/pods/b473ce6c-f37a-472a-a1f2-89332034cdee/volumes" Nov 25 12:13:06 crc kubenswrapper[4693]: I1125 12:13:06.825593 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" path="/var/lib/kubelet/pods/dd350aee-2f89-4b9f-ad62-454d6376a81f/volumes" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.111714 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rhsc2"] Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.111985 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112005 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112022 4693 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112034 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112054 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112067 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112087 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112099 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112119 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112131 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112147 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112159 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112175 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112187 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112201 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112212 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112236 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112249 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112266 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerName="marketplace-operator" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112278 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerName="marketplace-operator" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112292 
4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112304 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="extract-utilities" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112322 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112334 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="extract-content" Nov 25 12:13:07 crc kubenswrapper[4693]: E1125 12:13:07.112351 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112362 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112552 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa13974-bf19-4195-af55-6dec3741828d" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112572 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd350aee-2f89-4b9f-ad62-454d6376a81f" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112591 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="631e327a-21ca-4eb7-ab17-dd80766e4055" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112608 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="10bb8880-6561-4ce4-8ca9-847eda1571c1" containerName="registry-server" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.112629 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b473ce6c-f37a-472a-a1f2-89332034cdee" containerName="marketplace-operator" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.113790 4693 util.go:30] "No sandbox for pod can be found. 
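Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhsc2"

The RemoveStaleState burst above is the CPU and memory managers dropping per-container accounting left behind by the five pods deleted moments earlier, before the new redhat-marketplace-rhsc2 containers are admitted. Conceptually it is a map cleanup keyed by pod UID and container name; the types below are a toy sketch, not the kubelet's actual state structures:

    package main

    import "fmt"

    // key mirrors how stale entries are identified in the log lines:
    // a (podUID, containerName) pair.
    type key struct{ podUID, containerName string }

    // removeStaleState drops assignments whose pod is no longer active.
    func removeStaleState(assignments map[key]string, activePods map[string]bool) {
        for k := range assignments { // deleting during range is safe in Go
            if !activePods[k.podUID] {
                fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
                    k.podUID, k.containerName)
                delete(assignments, k)
            }
        }
    }

    func main() {
        assignments := map[key]string{
            {podUID: "10bb8880-6561-4ce4-8ca9-847eda1571c1", containerName: "registry-server"}: "cpus 0-1",
        }
        removeStaleState(assignments, map[string]bool{}) // no active pods remain
    }
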
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.122464 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhsc2"] Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.123784 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.132471 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-utilities\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.132529 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-catalog-content\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.132570 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2s47\" (UniqueName: \"kubernetes.io/projected/0c22ae76-cd45-47d2-bc53-c9e3d6026512-kube-api-access-j2s47\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.233272 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2s47\" (UniqueName: \"kubernetes.io/projected/0c22ae76-cd45-47d2-bc53-c9e3d6026512-kube-api-access-j2s47\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.233523 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-utilities\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.233612 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-catalog-content\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.233970 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-utilities\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.234130 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-catalog-content\") pod \"redhat-marketplace-rhsc2\" (UID: 
\"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.254797 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2s47\" (UniqueName: \"kubernetes.io/projected/0c22ae76-cd45-47d2-bc53-c9e3d6026512-kube-api-access-j2s47\") pod \"redhat-marketplace-rhsc2\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") " pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.306230 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hrqcz"] Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.307409 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.308839 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.315699 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hrqcz"] Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.439759 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhsc2" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.475271 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-utilities\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.475519 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfg92\" (UniqueName: \"kubernetes.io/projected/70c79042-9553-4044-845a-a164e846c298-kube-api-access-xfg92\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.475604 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-catalog-content\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.576602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-utilities\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.576658 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfg92\" (UniqueName: \"kubernetes.io/projected/70c79042-9553-4044-845a-a164e846c298-kube-api-access-xfg92\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.576686 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-catalog-content\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.577171 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-utilities\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.577192 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-catalog-content\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.597962 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfg92\" (UniqueName: \"kubernetes.io/projected/70c79042-9553-4044-845a-a164e846c298-kube-api-access-xfg92\") pod \"redhat-operators-hrqcz\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.599246 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhsc2"] Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.679618 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.767566 4693 generic.go:334] "Generic (PLEG): container finished" podID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerID="fb8cbfe6030020fe580364e910ad733bbfe900552e91877a4b69c29b053b302c" exitCode=0 Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.767752 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhsc2" event={"ID":"0c22ae76-cd45-47d2-bc53-c9e3d6026512","Type":"ContainerDied","Data":"fb8cbfe6030020fe580364e910ad733bbfe900552e91877a4b69c29b053b302c"} Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.771090 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhsc2" event={"ID":"0c22ae76-cd45-47d2-bc53-c9e3d6026512","Type":"ContainerStarted","Data":"e7a321d24a2f09e286386ca9fc534e4bef7f5d61805627ad0a7af4f2894ef904"} Nov 25 12:13:07 crc kubenswrapper[4693]: I1125 12:13:07.893184 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hrqcz"] Nov 25 12:13:07 crc kubenswrapper[4693]: W1125 12:13:07.895021 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70c79042_9553_4044_845a_a164e846c298.slice/crio-1e109e98ed2e4889b62c7ac2bacdad6997b500c354d50c47bd60fed2389a2528 WatchSource:0}: Error finding container 1e109e98ed2e4889b62c7ac2bacdad6997b500c354d50c47bd60fed2389a2528: Status 404 returned error can't find the container with id 1e109e98ed2e4889b62c7ac2bacdad6997b500c354d50c47bd60fed2389a2528 Nov 25 12:13:08 crc kubenswrapper[4693]: I1125 12:13:08.775835 4693 generic.go:334] "Generic (PLEG): container finished" podID="70c79042-9553-4044-845a-a164e846c298" 
containerID="eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6" exitCode=0 Nov 25 12:13:08 crc kubenswrapper[4693]: I1125 12:13:08.775905 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrqcz" event={"ID":"70c79042-9553-4044-845a-a164e846c298","Type":"ContainerDied","Data":"eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6"} Nov 25 12:13:08 crc kubenswrapper[4693]: I1125 12:13:08.776208 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrqcz" event={"ID":"70c79042-9553-4044-845a-a164e846c298","Type":"ContainerStarted","Data":"1e109e98ed2e4889b62c7ac2bacdad6997b500c354d50c47bd60fed2389a2528"} Nov 25 12:13:08 crc kubenswrapper[4693]: I1125 12:13:08.781084 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhsc2" event={"ID":"0c22ae76-cd45-47d2-bc53-c9e3d6026512","Type":"ContainerStarted","Data":"ad56b976e35142af1d5a3b84d9b3ce3b60744f85e34549b73e4d4dc094eaaf1c"} Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.521756 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bd7mr"] Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.527923 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.528040 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd7mr"] Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.531561 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.713001 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qkzhq"] Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.714132 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.716814 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-utilities\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.716858 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfmjk\" (UniqueName: \"kubernetes.io/projected/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-kube-api-access-dfmjk\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.716893 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-catalog-content\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.718903 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.724254 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qkzhq"] Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.795511 4693 generic.go:334] "Generic (PLEG): container finished" podID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerID="ad56b976e35142af1d5a3b84d9b3ce3b60744f85e34549b73e4d4dc094eaaf1c" exitCode=0 Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.795574 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhsc2" event={"ID":"0c22ae76-cd45-47d2-bc53-c9e3d6026512","Type":"ContainerDied","Data":"ad56b976e35142af1d5a3b84d9b3ce3b60744f85e34549b73e4d4dc094eaaf1c"} Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.818421 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-catalog-content\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.818506 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbqjv\" (UniqueName: \"kubernetes.io/projected/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-kube-api-access-tbqjv\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.818584 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-utilities\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc 
kubenswrapper[4693]: I1125 12:13:09.818617 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-utilities\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.818694 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfmjk\" (UniqueName: \"kubernetes.io/projected/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-kube-api-access-dfmjk\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.818756 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-catalog-content\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.819409 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-utilities\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.819536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-catalog-content\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.841877 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfmjk\" (UniqueName: \"kubernetes.io/projected/5ca5d5dc-02ea-48c2-9a3e-944359d44d84-kube-api-access-dfmjk\") pod \"certified-operators-bd7mr\" (UID: \"5ca5d5dc-02ea-48c2-9a3e-944359d44d84\") " pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.855097 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bd7mr" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.920700 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-catalog-content\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.920810 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbqjv\" (UniqueName: \"kubernetes.io/projected/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-kube-api-access-tbqjv\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.920963 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-utilities\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.921366 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-catalog-content\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.921803 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-utilities\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:09 crc kubenswrapper[4693]: I1125 12:13:09.942413 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbqjv\" (UniqueName: \"kubernetes.io/projected/24c65baa-b858-4f7f-8d19-a2e6ce7019a6-kube-api-access-tbqjv\") pod \"community-operators-qkzhq\" (UID: \"24c65baa-b858-4f7f-8d19-a2e6ce7019a6\") " pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.039529 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bd7mr"] Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.074205 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qkzhq" Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.281108 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qkzhq"] Nov 25 12:13:10 crc kubenswrapper[4693]: W1125 12:13:10.301477 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24c65baa_b858_4f7f_8d19_a2e6ce7019a6.slice/crio-56761c48e671d34d6805bc4d3dc742f9f6978c40e3aa27a4e943d64fec9233dd WatchSource:0}: Error finding container 56761c48e671d34d6805bc4d3dc742f9f6978c40e3aa27a4e943d64fec9233dd: Status 404 returned error can't find the container with id 56761c48e671d34d6805bc4d3dc742f9f6978c40e3aa27a4e943d64fec9233dd Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.801321 4693 generic.go:334] "Generic (PLEG): container finished" podID="24c65baa-b858-4f7f-8d19-a2e6ce7019a6" containerID="5b782a27a23c9216f5bd51ac565958ba541f0e6c46fd403aa07abbf59d41be17" exitCode=0 Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.801471 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qkzhq" event={"ID":"24c65baa-b858-4f7f-8d19-a2e6ce7019a6","Type":"ContainerDied","Data":"5b782a27a23c9216f5bd51ac565958ba541f0e6c46fd403aa07abbf59d41be17"} Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.801787 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qkzhq" event={"ID":"24c65baa-b858-4f7f-8d19-a2e6ce7019a6","Type":"ContainerStarted","Data":"56761c48e671d34d6805bc4d3dc742f9f6978c40e3aa27a4e943d64fec9233dd"} Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.806084 4693 generic.go:334] "Generic (PLEG): container finished" podID="70c79042-9553-4044-845a-a164e846c298" containerID="1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8" exitCode=0 Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.806163 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrqcz" event={"ID":"70c79042-9553-4044-845a-a164e846c298","Type":"ContainerDied","Data":"1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8"} Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.808000 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ca5d5dc-02ea-48c2-9a3e-944359d44d84" containerID="88021c8dbc2e2c734eb2ce3e0f4b1001849110e7ba2694289c837a99c69ef6b4" exitCode=0 Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.808068 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd7mr" event={"ID":"5ca5d5dc-02ea-48c2-9a3e-944359d44d84","Type":"ContainerDied","Data":"88021c8dbc2e2c734eb2ce3e0f4b1001849110e7ba2694289c837a99c69ef6b4"} Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.808097 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd7mr" event={"ID":"5ca5d5dc-02ea-48c2-9a3e-944359d44d84","Type":"ContainerStarted","Data":"402234ec1584beae96ecbd1b3eeb2de35d3f406e5026ac1dd31b0dfd077a1a69"} Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.821480 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhsc2" event={"ID":"0c22ae76-cd45-47d2-bc53-c9e3d6026512","Type":"ContainerStarted","Data":"99aedac81dfc49e5e8b6f52a559bd21dcf08f78361b31a5b595fa450b6d9b912"} Nov 25 12:13:10 crc kubenswrapper[4693]: I1125 12:13:10.872587 
Nov 25 12:13:11 crc kubenswrapper[4693]: I1125 12:13:11.821478 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrqcz" event={"ID":"70c79042-9553-4044-845a-a164e846c298","Type":"ContainerStarted","Data":"abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2"} Nov 25 12:13:11 crc kubenswrapper[4693]: I1125 12:13:11.824826 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ca5d5dc-02ea-48c2-9a3e-944359d44d84" containerID="fd86ff2bd08f06cab58e229b652d49643692379e2e8c9009da9d8486f3920ad7" exitCode=0 Nov 25 12:13:11 crc kubenswrapper[4693]: I1125 12:13:11.824916 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd7mr" event={"ID":"5ca5d5dc-02ea-48c2-9a3e-944359d44d84","Type":"ContainerDied","Data":"fd86ff2bd08f06cab58e229b652d49643692379e2e8c9009da9d8486f3920ad7"} Nov 25 12:13:11 crc kubenswrapper[4693]: I1125 12:13:11.865634 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hrqcz" podStartSLOduration=2.363859967 podStartE2EDuration="4.865617583s" podCreationTimestamp="2025-11-25 12:13:07 +0000 UTC" firstStartedPulling="2025-11-25 12:13:08.777445916 +0000 UTC m=+308.695531317" lastFinishedPulling="2025-11-25 12:13:11.279203542 +0000 UTC m=+311.197288933" observedRunningTime="2025-11-25 12:13:11.844823494 +0000 UTC m=+311.762908875" watchObservedRunningTime="2025-11-25 12:13:11.865617583 +0000 UTC m=+311.783702964" Nov 25 12:13:12 crc kubenswrapper[4693]: I1125 12:13:12.833806 4693 generic.go:334] "Generic (PLEG): container finished" podID="24c65baa-b858-4f7f-8d19-a2e6ce7019a6" containerID="daaef2ddc1cec194d594391447dfddb04a601023bf37500543b3d645db823af7" exitCode=0 Nov 25 12:13:12 crc kubenswrapper[4693]: I1125 12:13:12.833908 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qkzhq" event={"ID":"24c65baa-b858-4f7f-8d19-a2e6ce7019a6","Type":"ContainerDied","Data":"daaef2ddc1cec194d594391447dfddb04a601023bf37500543b3d645db823af7"} Nov 25 12:13:12 crc kubenswrapper[4693]: I1125 12:13:12.840848 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bd7mr" event={"ID":"5ca5d5dc-02ea-48c2-9a3e-944359d44d84","Type":"ContainerStarted","Data":"bb8c29a6eb39e0a10e2fff1276595fd2be72a4d6f09aa2d344e9f18da8abfbd6"} Nov 25 12:13:12 crc kubenswrapper[4693]: I1125 12:13:12.881792 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bd7mr" podStartSLOduration=2.345818334 podStartE2EDuration="3.881771998s" podCreationTimestamp="2025-11-25 12:13:09 +0000 UTC" firstStartedPulling="2025-11-25 12:13:10.809892388 +0000 UTC m=+310.727977759" lastFinishedPulling="2025-11-25 12:13:12.345846032 +0000 UTC m=+312.263931423" observedRunningTime="2025-11-25 12:13:12.880647005 +0000 UTC m=+312.798732386" 
watchObservedRunningTime="2025-11-25 12:13:12.881771998 +0000 UTC m=+312.799857379"
Nov 25 12:13:14 crc kubenswrapper[4693]: I1125 12:13:14.851988 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qkzhq" event={"ID":"24c65baa-b858-4f7f-8d19-a2e6ce7019a6","Type":"ContainerStarted","Data":"b132c5430d28aec11031bda8a55a6cafba8d3bd29b23c44c8772a6026a9988b4"}
Nov 25 12:13:14 crc kubenswrapper[4693]: I1125 12:13:14.873554 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qkzhq" podStartSLOduration=3.289901899 podStartE2EDuration="5.873534986s" podCreationTimestamp="2025-11-25 12:13:09 +0000 UTC" firstStartedPulling="2025-11-25 12:13:10.808054576 +0000 UTC m=+310.726139957" lastFinishedPulling="2025-11-25 12:13:13.391687663 +0000 UTC m=+313.309773044" observedRunningTime="2025-11-25 12:13:14.872225578 +0000 UTC m=+314.790310959" watchObservedRunningTime="2025-11-25 12:13:14.873534986 +0000 UTC m=+314.791620377"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.440887 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rhsc2"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.441278 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rhsc2"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.493120 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rhsc2"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.680671 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hrqcz"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.680733 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hrqcz"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.725562 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hrqcz"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.907220 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hrqcz"
Nov 25 12:13:17 crc kubenswrapper[4693]: I1125 12:13:17.920110 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rhsc2"
Nov 25 12:13:19 crc kubenswrapper[4693]: I1125 12:13:19.855475 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bd7mr"
Nov 25 12:13:19 crc kubenswrapper[4693]: I1125 12:13:19.855788 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bd7mr"
Nov 25 12:13:19 crc kubenswrapper[4693]: I1125 12:13:19.893978 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bd7mr"
Nov 25 12:13:19 crc kubenswrapper[4693]: I1125 12:13:19.940298 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bd7mr"
Nov 25 12:13:20 crc kubenswrapper[4693]: I1125 12:13:20.075258 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qkzhq"
Nov 25 12:13:20 crc kubenswrapper[4693]: I1125 12:13:20.075316 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qkzhq"
Nov 25 12:13:20 crc kubenswrapper[4693]: I1125 12:13:20.120132 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qkzhq"
Nov 25 12:13:20 crc kubenswrapper[4693]: I1125 12:13:20.929616 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qkzhq"
Nov 25 12:13:31 crc kubenswrapper[4693]: I1125 12:13:31.054366 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" podUID="c3bdbfb7-27fc-41d4-a157-36363c246c38" containerName="oauth-openshift" containerID="cri-o://7c586a16827a9c02f4fc8d83ff6e9b828815ab99ee657a7ddc97e366593c0961" gracePeriod=15
Nov 25 12:13:31 crc kubenswrapper[4693]: I1125 12:13:31.945514 4693 generic.go:334] "Generic (PLEG): container finished" podID="c3bdbfb7-27fc-41d4-a157-36363c246c38" containerID="7c586a16827a9c02f4fc8d83ff6e9b828815ab99ee657a7ddc97e366593c0961" exitCode=0
Nov 25 12:13:31 crc kubenswrapper[4693]: I1125 12:13:31.945558 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" event={"ID":"c3bdbfb7-27fc-41d4-a157-36363c246c38","Type":"ContainerDied","Data":"7c586a16827a9c02f4fc8d83ff6e9b828815ab99ee657a7ddc97e366593c0961"}
Nov 25 12:13:31 crc kubenswrapper[4693]: I1125 12:13:31.945835 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" event={"ID":"c3bdbfb7-27fc-41d4-a157-36363c246c38","Type":"ContainerDied","Data":"d842ea447c1ed0961d587012212626adf65513af7fb89f71f898740666671a0c"}
Nov 25 12:13:31 crc kubenswrapper[4693]: I1125 12:13:31.945846 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d842ea447c1ed0961d587012212626adf65513af7fb89f71f898740666671a0c"
Nov 25 12:13:31 crc kubenswrapper[4693]: I1125 12:13:31.988123 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.010204 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-serving-cert\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.010599 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-provider-selection\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.010638 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-cliconfig\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.012600 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-router-certs\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.012661 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-service-ca\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.012694 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/c3bdbfb7-27fc-41d4-a157-36363c246c38-kube-api-access-8wv4r\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.012725 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-policies\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.012803 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-idp-0-file-data\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.012887 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-trusted-ca-bundle\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
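[annotation] The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration (5.873534986s) is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (3.289901899) is that E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling = 2.583633087s). The same relationship holds for the oauth-openshift entry further below, where both pull timestamps are zero and the two durations are equal. A minimal Go sketch reproducing the arithmetic from the logged timestamps (the subtraction rule is inferred from these values, not quoted from kubelet source):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the timestamp format printed in the log lines above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-11-25 12:13:09 +0000 UTC")              // podCreationTimestamp
	firstPull := parse("2025-11-25 12:13:10.808054576 +0000 UTC")  // firstStartedPulling
	lastPull := parse("2025-11-25 12:13:13.391687663 +0000 UTC")   // lastFinishedPulling
	running := parse("2025-11-25 12:13:14.873534986 +0000 UTC")    // watchObservedRunningTime

	e2e := running.Sub(created)        // 5.873534986s = podStartE2EDuration
	pull := lastPull.Sub(firstPull)    // 2.583633087s spent pulling the image
	slo := e2e - pull                  // 3.289901899s = podStartSLOduration
	fmt.Println(e2e, pull, slo)
}
```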
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.012950 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-error\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.013018 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-session\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.013063 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-login\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.013796 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-ocp-branding-template\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.013872 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-dir\") pod \"c3bdbfb7-27fc-41d4-a157-36363c246c38\" (UID: \"c3bdbfb7-27fc-41d4-a157-36363c246c38\") "
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.014288 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.020145 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.022736 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.023165 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.023497 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.024041 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.029561 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.029958 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.030583 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.033109 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-747bd66b49-n7zjs"]
Nov 25 12:13:32 crc kubenswrapper[4693]: E1125 12:13:32.033532 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3bdbfb7-27fc-41d4-a157-36363c246c38" containerName="oauth-openshift"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.033557 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3bdbfb7-27fc-41d4-a157-36363c246c38" containerName="oauth-openshift"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.033749 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3bdbfb7-27fc-41d4-a157-36363c246c38" containerName="oauth-openshift"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.034423 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.036061 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.038050 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-747bd66b49-n7zjs"]
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.038331 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3bdbfb7-27fc-41d4-a157-36363c246c38-kube-api-access-8wv4r" (OuterVolumeSpecName: "kube-api-access-8wv4r") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "kube-api-access-8wv4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.046606 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.049919 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.055662 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c3bdbfb7-27fc-41d4-a157-36363c246c38" (UID: "c3bdbfb7-27fc-41d4-a157-36363c246c38"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115497 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115610 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg56t\" (UniqueName: \"kubernetes.io/projected/e24f991f-a35b-477d-b41f-e93d6827cfee-kube-api-access-mg56t\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115634 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-service-ca\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115650 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-login\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115728 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-session\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115751 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-router-certs\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115807 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs"
\"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115881 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115918 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-audit-policies\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.115977 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-error\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116052 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116069 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116104 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e24f991f-a35b-477d-b41f-e93d6827cfee-audit-dir\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116269 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116282 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116293 4693 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116302 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116311 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116320 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116348 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116357 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116366 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wv4r\" (UniqueName: \"kubernetes.io/projected/c3bdbfb7-27fc-41d4-a157-36363c246c38-kube-api-access-8wv4r\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116399 4693 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116408 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116416 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116425 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.116434 4693 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/c3bdbfb7-27fc-41d4-a157-36363c246c38-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217779 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217815 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217845 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217877 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-audit-policies\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217903 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217919 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-error\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e24f991f-a35b-477d-b41f-e93d6827cfee-audit-dir\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217953 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: 
\"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.217994 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.218013 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg56t\" (UniqueName: \"kubernetes.io/projected/e24f991f-a35b-477d-b41f-e93d6827cfee-kube-api-access-mg56t\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.218030 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-service-ca\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.218046 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-login\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.218063 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-session\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.218105 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-router-certs\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.218446 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e24f991f-a35b-477d-b41f-e93d6827cfee-audit-dir\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.219202 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " 
pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.220255 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.220764 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-audit-policies\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.221198 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-router-certs\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.221743 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-service-ca\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.221813 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.221820 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.222242 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.223657 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-error\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 
crc kubenswrapper[4693]: I1125 12:13:32.224722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.225853 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-user-template-login\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.226793 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e24f991f-a35b-477d-b41f-e93d6827cfee-v4-0-config-system-session\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.238030 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg56t\" (UniqueName: \"kubernetes.io/projected/e24f991f-a35b-477d-b41f-e93d6827cfee-kube-api-access-mg56t\") pod \"oauth-openshift-747bd66b49-n7zjs\" (UID: \"e24f991f-a35b-477d-b41f-e93d6827cfee\") " pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.388796 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.591249 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-747bd66b49-n7zjs"] Nov 25 12:13:32 crc kubenswrapper[4693]: W1125 12:13:32.595234 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24f991f_a35b_477d_b41f_e93d6827cfee.slice/crio-0fcceb38ab2ef163955b74e9ac5bc61640f377f1a0123f3963e798efd79e233b WatchSource:0}: Error finding container 0fcceb38ab2ef163955b74e9ac5bc61640f377f1a0123f3963e798efd79e233b: Status 404 returned error can't find the container with id 0fcceb38ab2ef163955b74e9ac5bc61640f377f1a0123f3963e798efd79e233b Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.953366 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" event={"ID":"e24f991f-a35b-477d-b41f-e93d6827cfee","Type":"ContainerStarted","Data":"0fcceb38ab2ef163955b74e9ac5bc61640f377f1a0123f3963e798efd79e233b"} Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.953421 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zm7pc" Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.977742 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zm7pc"] Nov 25 12:13:32 crc kubenswrapper[4693]: I1125 12:13:32.981268 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zm7pc"] Nov 25 12:13:33 crc kubenswrapper[4693]: I1125 12:13:33.964955 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" event={"ID":"e24f991f-a35b-477d-b41f-e93d6827cfee","Type":"ContainerStarted","Data":"5a6e02c97df36d77e65842cc9de9c4c6652881b563473c23b717dd68d8c0e806"} Nov 25 12:13:33 crc kubenswrapper[4693]: I1125 12:13:33.965420 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:33 crc kubenswrapper[4693]: I1125 12:13:33.977693 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" Nov 25 12:13:34 crc kubenswrapper[4693]: I1125 12:13:34.010843 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-747bd66b49-n7zjs" podStartSLOduration=28.010817644 podStartE2EDuration="28.010817644s" podCreationTimestamp="2025-11-25 12:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:13:34.001999965 +0000 UTC m=+333.920085436" watchObservedRunningTime="2025-11-25 12:13:34.010817644 +0000 UTC m=+333.928903035" Nov 25 12:13:34 crc kubenswrapper[4693]: I1125 12:13:34.821020 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3bdbfb7-27fc-41d4-a157-36363c246c38" path="/var/lib/kubelet/pods/c3bdbfb7-27fc-41d4-a157-36363c246c38/volumes" Nov 25 12:14:35 crc kubenswrapper[4693]: I1125 12:14:35.114260 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:14:35 crc kubenswrapper[4693]: I1125 12:14:35.114784 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.134272 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9"] Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.135937 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.138729 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.144492 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.145836 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9"] Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.185214 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-secret-volume\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.185267 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-config-volume\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.185351 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkknd\" (UniqueName: \"kubernetes.io/projected/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-kube-api-access-mkknd\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.287292 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-config-volume\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.287819 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkknd\" (UniqueName: \"kubernetes.io/projected/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-kube-api-access-mkknd\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.287914 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-secret-volume\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.289175 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-config-volume\") pod 
\"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.295473 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-secret-volume\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.306664 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkknd\" (UniqueName: \"kubernetes.io/projected/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-kube-api-access-mkknd\") pod \"collect-profiles-29401215-qv6t9\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.458430 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:00 crc kubenswrapper[4693]: I1125 12:15:00.859762 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9"] Nov 25 12:15:01 crc kubenswrapper[4693]: I1125 12:15:01.495939 4693 generic.go:334] "Generic (PLEG): container finished" podID="401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" containerID="095e216724e167ef0c237df79c4ffcb351ce969a7ff1ea062d45cf98bd36f7e4" exitCode=0 Nov 25 12:15:01 crc kubenswrapper[4693]: I1125 12:15:01.496020 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" event={"ID":"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d","Type":"ContainerDied","Data":"095e216724e167ef0c237df79c4ffcb351ce969a7ff1ea062d45cf98bd36f7e4"} Nov 25 12:15:01 crc kubenswrapper[4693]: I1125 12:15:01.496250 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" event={"ID":"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d","Type":"ContainerStarted","Data":"d0f2742b9e7f6f041d215dbf082c98cd0d3f4d86a221c4b3c491b39836d6627c"} Nov 25 12:15:02 crc kubenswrapper[4693]: I1125 12:15:02.790837 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:02 crc kubenswrapper[4693]: I1125 12:15:02.951078 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-config-volume\") pod \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " Nov 25 12:15:02 crc kubenswrapper[4693]: I1125 12:15:02.951144 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkknd\" (UniqueName: \"kubernetes.io/projected/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-kube-api-access-mkknd\") pod \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " Nov 25 12:15:02 crc kubenswrapper[4693]: I1125 12:15:02.951213 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-secret-volume\") pod \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\" (UID: \"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d\") " Nov 25 12:15:02 crc kubenswrapper[4693]: I1125 12:15:02.952720 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-config-volume" (OuterVolumeSpecName: "config-volume") pod "401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" (UID: "401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:15:02 crc kubenswrapper[4693]: I1125 12:15:02.956964 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" (UID: "401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:15:02 crc kubenswrapper[4693]: I1125 12:15:02.957188 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-kube-api-access-mkknd" (OuterVolumeSpecName: "kube-api-access-mkknd") pod "401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" (UID: "401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d"). InnerVolumeSpecName "kube-api-access-mkknd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:15:03 crc kubenswrapper[4693]: I1125 12:15:03.052912 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkknd\" (UniqueName: \"kubernetes.io/projected/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-kube-api-access-mkknd\") on node \"crc\" DevicePath \"\"" Nov 25 12:15:03 crc kubenswrapper[4693]: I1125 12:15:03.052957 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:15:03 crc kubenswrapper[4693]: I1125 12:15:03.052968 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:15:03 crc kubenswrapper[4693]: I1125 12:15:03.510319 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" event={"ID":"401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d","Type":"ContainerDied","Data":"d0f2742b9e7f6f041d215dbf082c98cd0d3f4d86a221c4b3c491b39836d6627c"} Nov 25 12:15:03 crc kubenswrapper[4693]: I1125 12:15:03.510626 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0f2742b9e7f6f041d215dbf082c98cd0d3f4d86a221c4b3c491b39836d6627c" Nov 25 12:15:03 crc kubenswrapper[4693]: I1125 12:15:03.510407 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9" Nov 25 12:15:05 crc kubenswrapper[4693]: I1125 12:15:05.113845 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:15:05 crc kubenswrapper[4693]: I1125 12:15:05.114369 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.113636 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.114240 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.114300 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.114828 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b88ee2add6c7828542a3cee62632b97ee1acd6379863900fa881c0767075ca70"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.114892 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://b88ee2add6c7828542a3cee62632b97ee1acd6379863900fa881c0767075ca70" gracePeriod=600 Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.704451 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="b88ee2add6c7828542a3cee62632b97ee1acd6379863900fa881c0767075ca70" exitCode=0 Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.705062 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"b88ee2add6c7828542a3cee62632b97ee1acd6379863900fa881c0767075ca70"} Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.705110 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"c8db9f943783b89f4f1f5b7dfdca47ee2c64b3dd3be4d4df26e91fae510d1733"} Nov 25 12:15:35 crc kubenswrapper[4693]: I1125 12:15:35.705130 4693 scope.go:117] "RemoveContainer" containerID="094efe7116ff662092f1109388739a4c9bf9844df919eb2af0c5e25a1148748b" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.206296 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-862xs"] Nov 25 12:15:45 crc kubenswrapper[4693]: E1125 12:15:45.206867 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" containerName="collect-profiles" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.206884 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" containerName="collect-profiles" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.206999 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" containerName="collect-profiles" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.207536 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.225635 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-862xs"] Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.407607 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cbd3188-9434-4a44-81bc-59a073906e05-trusted-ca\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.407731 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8cbd3188-9434-4a44-81bc-59a073906e05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.407812 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bmzd\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-kube-api-access-2bmzd\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.407853 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-bound-sa-token\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.407959 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8cbd3188-9434-4a44-81bc-59a073906e05-registry-certificates\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.408014 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.408073 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-registry-tls\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.408125 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8cbd3188-9434-4a44-81bc-59a073906e05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.435242 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.509880 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8cbd3188-9434-4a44-81bc-59a073906e05-registry-certificates\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.509975 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-registry-tls\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.510029 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8cbd3188-9434-4a44-81bc-59a073906e05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.510086 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cbd3188-9434-4a44-81bc-59a073906e05-trusted-ca\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.510126 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8cbd3188-9434-4a44-81bc-59a073906e05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.510180 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bmzd\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-kube-api-access-2bmzd\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.510734 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-bound-sa-token\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.511582 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8cbd3188-9434-4a44-81bc-59a073906e05-ca-trust-extracted\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.513737 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8cbd3188-9434-4a44-81bc-59a073906e05-trusted-ca\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.513997 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8cbd3188-9434-4a44-81bc-59a073906e05-registry-certificates\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.519500 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-registry-tls\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.520514 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8cbd3188-9434-4a44-81bc-59a073906e05-installation-pull-secrets\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.539905 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bmzd\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-kube-api-access-2bmzd\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.542147 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8cbd3188-9434-4a44-81bc-59a073906e05-bound-sa-token\") pod \"image-registry-66df7c8f76-862xs\" (UID: \"8cbd3188-9434-4a44-81bc-59a073906e05\") " pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:45 crc kubenswrapper[4693]: I1125 12:15:45.827554 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:46 crc kubenswrapper[4693]: I1125 12:15:46.044137 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-862xs"] Nov 25 12:15:46 crc kubenswrapper[4693]: I1125 12:15:46.773501 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-862xs" event={"ID":"8cbd3188-9434-4a44-81bc-59a073906e05","Type":"ContainerStarted","Data":"1ad0f214b43ba43c84355969bb2dc1b45215073f1787b329cb08f117cd52edc6"} Nov 25 12:15:46 crc kubenswrapper[4693]: I1125 12:15:46.773837 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:15:46 crc kubenswrapper[4693]: I1125 12:15:46.773852 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-862xs" event={"ID":"8cbd3188-9434-4a44-81bc-59a073906e05","Type":"ContainerStarted","Data":"02f37d2d694bcc1b4e2023414ffde19319909938462e07b54e6c21c6673ebc5f"} Nov 25 12:15:46 crc kubenswrapper[4693]: I1125 12:15:46.800472 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-862xs" podStartSLOduration=1.8004500060000002 podStartE2EDuration="1.800450006s" podCreationTimestamp="2025-11-25 12:15:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:15:46.799020789 +0000 UTC m=+466.717106190" watchObservedRunningTime="2025-11-25 12:15:46.800450006 +0000 UTC m=+466.718535407" Nov 25 12:16:05 crc kubenswrapper[4693]: I1125 12:16:05.840204 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-862xs" Nov 25 12:16:05 crc kubenswrapper[4693]: I1125 12:16:05.912757 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg5pd"] Nov 25 12:16:30 crc kubenswrapper[4693]: I1125 12:16:30.959767 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" podUID="5272f00f-cfb7-49dc-860c-50ec9ee0bd32" containerName="registry" containerID="cri-o://2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd" gracePeriod=30 Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.358634 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.465791 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-certificates\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.465843 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-bound-sa-token\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.465879 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-installation-pull-secrets\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.466072 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.466115 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-ca-trust-extracted\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.466144 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-trusted-ca\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.466178 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-tls\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.466207 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz25z\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-kube-api-access-vz25z\") pod \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\" (UID: \"5272f00f-cfb7-49dc-860c-50ec9ee0bd32\") " Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.467656 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.467735 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.474774 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-kube-api-access-vz25z" (OuterVolumeSpecName: "kube-api-access-vz25z") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "kube-api-access-vz25z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.475294 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.478522 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.479234 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.479431 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.493445 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5272f00f-cfb7-49dc-860c-50ec9ee0bd32" (UID: "5272f00f-cfb7-49dc-860c-50ec9ee0bd32"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.568278 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz25z\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-kube-api-access-vz25z\") on node \"crc\" DevicePath \"\"" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.568655 4693 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.568675 4693 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.568693 4693 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.568712 4693 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.568730 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:16:31 crc kubenswrapper[4693]: I1125 12:16:31.568748 4693 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5272f00f-cfb7-49dc-860c-50ec9ee0bd32-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.060478 4693 generic.go:334] "Generic (PLEG): container finished" podID="5272f00f-cfb7-49dc-860c-50ec9ee0bd32" containerID="2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd" exitCode=0 Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.060539 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" event={"ID":"5272f00f-cfb7-49dc-860c-50ec9ee0bd32","Type":"ContainerDied","Data":"2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd"} Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.060551 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.060578 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cg5pd" event={"ID":"5272f00f-cfb7-49dc-860c-50ec9ee0bd32","Type":"ContainerDied","Data":"7280994f7fd51e3c049b5cec16a6229398ca11f816baec84603e9cf67b550efd"} Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.060599 4693 scope.go:117] "RemoveContainer" containerID="2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd" Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.083609 4693 scope.go:117] "RemoveContainer" containerID="2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd" Nov 25 12:16:32 crc kubenswrapper[4693]: E1125 12:16:32.084010 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd\": container with ID starting with 2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd not found: ID does not exist" containerID="2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd" Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.084040 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd"} err="failed to get container status \"2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd\": rpc error: code = NotFound desc = could not find container \"2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd\": container with ID starting with 2066be10907b033da30051ea43e8a54c085e7bb49216099d99a5dabeead335dd not found: ID does not exist" Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.096428 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg5pd"] Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.099945 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cg5pd"] Nov 25 12:16:32 crc kubenswrapper[4693]: I1125 12:16:32.821855 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5272f00f-cfb7-49dc-860c-50ec9ee0bd32" path="/var/lib/kubelet/pods/5272f00f-cfb7-49dc-860c-50ec9ee0bd32/volumes" Nov 25 12:17:01 crc kubenswrapper[4693]: I1125 12:17:01.004545 4693 scope.go:117] "RemoveContainer" containerID="7c586a16827a9c02f4fc8d83ff6e9b828815ab99ee657a7ddc97e366593c0961" Nov 25 12:17:35 crc kubenswrapper[4693]: I1125 12:17:35.113951 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:17:35 crc kubenswrapper[4693]: I1125 12:17:35.114519 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:18:05 crc kubenswrapper[4693]: I1125 12:18:05.113576 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:18:05 crc kubenswrapper[4693]: I1125 12:18:05.114303 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.113525 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.114120 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.114185 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.115089 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8db9f943783b89f4f1f5b7dfdca47ee2c64b3dd3be4d4df26e91fae510d1733"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.115192 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://c8db9f943783b89f4f1f5b7dfdca47ee2c64b3dd3be4d4df26e91fae510d1733" gracePeriod=600 Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.841017 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="c8db9f943783b89f4f1f5b7dfdca47ee2c64b3dd3be4d4df26e91fae510d1733" exitCode=0 Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.841468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"c8db9f943783b89f4f1f5b7dfdca47ee2c64b3dd3be4d4df26e91fae510d1733"} Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.841496 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"c48b1bd5f615c180301fa268ce0ea0e2b9ab9ea9e6d73443257071ddeda6d194"} Nov 25 12:18:35 crc kubenswrapper[4693]: I1125 12:18:35.841511 4693 scope.go:117] "RemoveContainer" containerID="b88ee2add6c7828542a3cee62632b97ee1acd6379863900fa881c0767075ca70" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.457214 4693 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-krwsc"] Nov 25 12:18:39 crc kubenswrapper[4693]: E1125 12:18:39.458073 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5272f00f-cfb7-49dc-860c-50ec9ee0bd32" containerName="registry" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.458092 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5272f00f-cfb7-49dc-860c-50ec9ee0bd32" containerName="registry" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.458241 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5272f00f-cfb7-49dc-860c-50ec9ee0bd32" containerName="registry" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.458638 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.462066 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.462262 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.462438 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-djldv" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.469424 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zc5h2"] Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.470172 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-zc5h2" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.474242 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-snzcd" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.484205 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-94qfr"] Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.485414 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.487711 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zlsqm" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.501132 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-94qfr"] Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.516946 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-krwsc"] Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.517218 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w4v7\" (UniqueName: \"kubernetes.io/projected/886fc2dd-e1c6-4822-b516-1540c9e77f39-kube-api-access-9w4v7\") pod \"cert-manager-cainjector-7f985d654d-krwsc\" (UID: \"886fc2dd-e1c6-4822-b516-1540c9e77f39\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.517277 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzvb6\" (UniqueName: \"kubernetes.io/projected/6fff8a61-6848-4e20-bc9b-cc0d8e4299d4-kube-api-access-tzvb6\") pod \"cert-manager-webhook-5655c58dd6-94qfr\" (UID: \"6fff8a61-6848-4e20-bc9b-cc0d8e4299d4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.517320 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255wz\" (UniqueName: \"kubernetes.io/projected/b9284e69-f82a-44ea-bee3-627c08d1d86c-kube-api-access-255wz\") pod \"cert-manager-5b446d88c5-zc5h2\" (UID: \"b9284e69-f82a-44ea-bee3-627c08d1d86c\") " pod="cert-manager/cert-manager-5b446d88c5-zc5h2" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.543601 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zc5h2"] Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.618137 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w4v7\" (UniqueName: \"kubernetes.io/projected/886fc2dd-e1c6-4822-b516-1540c9e77f39-kube-api-access-9w4v7\") pod \"cert-manager-cainjector-7f985d654d-krwsc\" (UID: \"886fc2dd-e1c6-4822-b516-1540c9e77f39\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.618203 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzvb6\" (UniqueName: \"kubernetes.io/projected/6fff8a61-6848-4e20-bc9b-cc0d8e4299d4-kube-api-access-tzvb6\") pod \"cert-manager-webhook-5655c58dd6-94qfr\" (UID: \"6fff8a61-6848-4e20-bc9b-cc0d8e4299d4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.618231 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-255wz\" (UniqueName: \"kubernetes.io/projected/b9284e69-f82a-44ea-bee3-627c08d1d86c-kube-api-access-255wz\") pod \"cert-manager-5b446d88c5-zc5h2\" (UID: \"b9284e69-f82a-44ea-bee3-627c08d1d86c\") " pod="cert-manager/cert-manager-5b446d88c5-zc5h2" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.637570 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-255wz\" (UniqueName: 
\"kubernetes.io/projected/b9284e69-f82a-44ea-bee3-627c08d1d86c-kube-api-access-255wz\") pod \"cert-manager-5b446d88c5-zc5h2\" (UID: \"b9284e69-f82a-44ea-bee3-627c08d1d86c\") " pod="cert-manager/cert-manager-5b446d88c5-zc5h2" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.637718 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w4v7\" (UniqueName: \"kubernetes.io/projected/886fc2dd-e1c6-4822-b516-1540c9e77f39-kube-api-access-9w4v7\") pod \"cert-manager-cainjector-7f985d654d-krwsc\" (UID: \"886fc2dd-e1c6-4822-b516-1540c9e77f39\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.640445 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzvb6\" (UniqueName: \"kubernetes.io/projected/6fff8a61-6848-4e20-bc9b-cc0d8e4299d4-kube-api-access-tzvb6\") pod \"cert-manager-webhook-5655c58dd6-94qfr\" (UID: \"6fff8a61-6848-4e20-bc9b-cc0d8e4299d4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.782535 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.799649 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-zc5h2" Nov 25 12:18:39 crc kubenswrapper[4693]: I1125 12:18:39.808221 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" Nov 25 12:18:40 crc kubenswrapper[4693]: I1125 12:18:40.038256 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-zc5h2"] Nov 25 12:18:40 crc kubenswrapper[4693]: W1125 12:18:40.054574 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9284e69_f82a_44ea_bee3_627c08d1d86c.slice/crio-74e7e865d5ddf09361a5d08cd175b2bd3e9ce89c1c17392bbeeaa3117ef57ddd WatchSource:0}: Error finding container 74e7e865d5ddf09361a5d08cd175b2bd3e9ce89c1c17392bbeeaa3117ef57ddd: Status 404 returned error can't find the container with id 74e7e865d5ddf09361a5d08cd175b2bd3e9ce89c1c17392bbeeaa3117ef57ddd Nov 25 12:18:40 crc kubenswrapper[4693]: I1125 12:18:40.057216 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 12:18:40 crc kubenswrapper[4693]: I1125 12:18:40.100747 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-94qfr"] Nov 25 12:18:40 crc kubenswrapper[4693]: W1125 12:18:40.114436 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fff8a61_6848_4e20_bc9b_cc0d8e4299d4.slice/crio-701e51eeee81b5c9dfa836101bce9e30e839e778148ae204eb7692d1c6c58992 WatchSource:0}: Error finding container 701e51eeee81b5c9dfa836101bce9e30e839e778148ae204eb7692d1c6c58992: Status 404 returned error can't find the container with id 701e51eeee81b5c9dfa836101bce9e30e839e778148ae204eb7692d1c6c58992 Nov 25 12:18:40 crc kubenswrapper[4693]: I1125 12:18:40.199786 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-krwsc"] Nov 25 12:18:40 crc kubenswrapper[4693]: W1125 12:18:40.206968 4693 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886fc2dd_e1c6_4822_b516_1540c9e77f39.slice/crio-b015690b2e4c20160d85e931b6d6c9636bc819cf7e5587d98dfbdc3450679bcb WatchSource:0}: Error finding container b015690b2e4c20160d85e931b6d6c9636bc819cf7e5587d98dfbdc3450679bcb: Status 404 returned error can't find the container with id b015690b2e4c20160d85e931b6d6c9636bc819cf7e5587d98dfbdc3450679bcb Nov 25 12:18:40 crc kubenswrapper[4693]: I1125 12:18:40.873278 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-zc5h2" event={"ID":"b9284e69-f82a-44ea-bee3-627c08d1d86c","Type":"ContainerStarted","Data":"74e7e865d5ddf09361a5d08cd175b2bd3e9ce89c1c17392bbeeaa3117ef57ddd"} Nov 25 12:18:40 crc kubenswrapper[4693]: I1125 12:18:40.874283 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" event={"ID":"886fc2dd-e1c6-4822-b516-1540c9e77f39","Type":"ContainerStarted","Data":"b015690b2e4c20160d85e931b6d6c9636bc819cf7e5587d98dfbdc3450679bcb"} Nov 25 12:18:40 crc kubenswrapper[4693]: I1125 12:18:40.875485 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" event={"ID":"6fff8a61-6848-4e20-bc9b-cc0d8e4299d4","Type":"ContainerStarted","Data":"701e51eeee81b5c9dfa836101bce9e30e839e778148ae204eb7692d1c6c58992"} Nov 25 12:18:43 crc kubenswrapper[4693]: I1125 12:18:43.907433 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-zc5h2" event={"ID":"b9284e69-f82a-44ea-bee3-627c08d1d86c","Type":"ContainerStarted","Data":"b5f15aa911e05de4d96d4932e54ecf8147f08fd8ab6c099689dfb7bd36578670"} Nov 25 12:18:43 crc kubenswrapper[4693]: I1125 12:18:43.909829 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" event={"ID":"886fc2dd-e1c6-4822-b516-1540c9e77f39","Type":"ContainerStarted","Data":"bb4263ecb7b497deddc68a0dc14305da446c89c7f0dcbc4a5a09a6d6bb5e715e"} Nov 25 12:18:43 crc kubenswrapper[4693]: I1125 12:18:43.912063 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" event={"ID":"6fff8a61-6848-4e20-bc9b-cc0d8e4299d4","Type":"ContainerStarted","Data":"dbb8dfe7b74d8b091888865e6a69fb3410f406f1ab3ea6a947c2bf102245471d"} Nov 25 12:18:43 crc kubenswrapper[4693]: I1125 12:18:43.938108 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-zc5h2" podStartSLOduration=1.627910389 podStartE2EDuration="4.938081625s" podCreationTimestamp="2025-11-25 12:18:39 +0000 UTC" firstStartedPulling="2025-11-25 12:18:40.056742572 +0000 UTC m=+639.974827953" lastFinishedPulling="2025-11-25 12:18:43.366913808 +0000 UTC m=+643.284999189" observedRunningTime="2025-11-25 12:18:43.928948738 +0000 UTC m=+643.847034199" watchObservedRunningTime="2025-11-25 12:18:43.938081625 +0000 UTC m=+643.856167046" Nov 25 12:18:43 crc kubenswrapper[4693]: I1125 12:18:43.971748 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" podStartSLOduration=1.660989416 podStartE2EDuration="4.971715877s" podCreationTimestamp="2025-11-25 12:18:39 +0000 UTC" firstStartedPulling="2025-11-25 12:18:40.121859983 +0000 UTC m=+640.039945354" lastFinishedPulling="2025-11-25 12:18:43.432586424 +0000 UTC m=+643.350671815" observedRunningTime="2025-11-25 12:18:43.955902845 +0000 UTC m=+643.873988236" 
watchObservedRunningTime="2025-11-25 12:18:43.971715877 +0000 UTC m=+643.889801298" Nov 25 12:18:43 crc kubenswrapper[4693]: I1125 12:18:43.972788 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" podStartSLOduration=1.800383726 podStartE2EDuration="4.972773923s" podCreationTimestamp="2025-11-25 12:18:39 +0000 UTC" firstStartedPulling="2025-11-25 12:18:40.209269736 +0000 UTC m=+640.127355117" lastFinishedPulling="2025-11-25 12:18:43.381659943 +0000 UTC m=+643.299745314" observedRunningTime="2025-11-25 12:18:43.969932992 +0000 UTC m=+643.888018383" watchObservedRunningTime="2025-11-25 12:18:43.972773923 +0000 UTC m=+643.890859344" Nov 25 12:18:44 crc kubenswrapper[4693]: I1125 12:18:44.809316 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.799567 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sn9jm"] Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.800413 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-controller" containerID="cri-o://8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.800526 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="sbdb" containerID="cri-o://55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.800557 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.800544 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="northd" containerID="cri-o://72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.800643 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-node" containerID="cri-o://709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.800655 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-acl-logging" containerID="cri-o://f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.800420 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="nbdb" 
containerID="cri-o://3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.812450 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-94qfr" Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.887946 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" containerID="cri-o://bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" gracePeriod=30 Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.985028 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/3.log" Nov 25 12:18:49 crc kubenswrapper[4693]: I1125 12:18:49.995720 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovn-acl-logging/0.log" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.000812 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovn-controller/0.log" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.004678 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" exitCode=0 Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.004714 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" exitCode=143 Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.004725 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" exitCode=143 Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.004749 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252"} Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.004782 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae"} Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.004796 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0"} Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.159814 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/3.log" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.161946 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovn-acl-logging/0.log" Nov 
25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.162448 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovn-controller/0.log" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.162810 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212496 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pztc6"] Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212692 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="nbdb" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212707 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="nbdb" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212719 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212728 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212737 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212744 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212751 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-node" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212757 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-node" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212764 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212770 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212779 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212785 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212794 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212800 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212806 4693 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-acl-logging" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212812 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-acl-logging" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212819 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="northd" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212824 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="northd" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212834 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kubecfg-setup" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212839 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kubecfg-setup" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.212848 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="sbdb" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212854 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="sbdb" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212938 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="nbdb" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212948 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212955 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-node" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212965 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="northd" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212973 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="sbdb" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212981 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-acl-logging" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212988 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="kube-rbac-proxy-ovn-metrics" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.212995 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.213001 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovn-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.213011 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.213018 4693 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.213102 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.213109 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: E1125 12:18:50.213121 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.213127 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.213224 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerName="ovnkube-controller" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.214721 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277596 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-openvswitch\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277655 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-ovn\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277745 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277761 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277809 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-netd\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277860 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277893 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-node-log\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277927 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-config\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277948 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-var-lib-openvswitch\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.277975 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278020 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-node-log" (OuterVolumeSpecName: "node-log") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278155 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-kubelet\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278193 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-systemd-units\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278221 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278242 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278244 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-log-socket\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278261 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278267 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-log-socket" (OuterVolumeSpecName: "log-socket") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278287 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278298 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-ovn-kubernetes\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278329 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278334 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-env-overrides\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278445 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-bin\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278467 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-slash\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278492 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-script-lib\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278500 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278524 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4c247f7d-6187-4052-baee-5c5841e1d9da-ovn-node-metrics-cert\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278543 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-slash" (OuterVolumeSpecName: "host-slash") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278550 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-netns\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278573 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-etc-openvswitch\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278575 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278600 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlfn9\" (UniqueName: \"kubernetes.io/projected/4c247f7d-6187-4052-baee-5c5841e1d9da-kube-api-access-dlfn9\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278632 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-systemd\") pod \"4c247f7d-6187-4052-baee-5c5841e1d9da\" (UID: \"4c247f7d-6187-4052-baee-5c5841e1d9da\") " Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278720 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-cni-netd\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278750 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-run-netns\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278762 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278812 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278830 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278777 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-log-socket\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278924 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-ovnkube-script-lib\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278968 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.278989 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7mx\" (UniqueName: \"kubernetes.io/projected/713d7d21-0b1a-4dac-a300-4974106541a7-kube-api-access-4d7mx\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279010 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279030 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-etc-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279085 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-cni-bin\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279119 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-kubelet\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279155 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-node-log\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279183 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-env-overrides\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279240 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-ovn\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279269 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/713d7d21-0b1a-4dac-a300-4974106541a7-ovn-node-metrics-cert\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279307 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279353 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-systemd-units\") pod 
\"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279405 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-systemd\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279435 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279471 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-var-lib-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-ovnkube-config\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279641 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-slash\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279790 4693 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279807 4693 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-node-log\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279820 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279832 4693 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279843 4693 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc 
kubenswrapper[4693]: I1125 12:18:50.279855 4693 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279867 4693 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279878 4693 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-log-socket\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279891 4693 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279905 4693 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279915 4693 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279926 4693 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-slash\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279938 4693 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4c247f7d-6187-4052-baee-5c5841e1d9da-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279950 4693 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279961 4693 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279974 4693 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.279985 4693 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.283827 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c247f7d-6187-4052-baee-5c5841e1d9da-kube-api-access-dlfn9" (OuterVolumeSpecName: "kube-api-access-dlfn9") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" 
(UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "kube-api-access-dlfn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.284023 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c247f7d-6187-4052-baee-5c5841e1d9da-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.291132 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4c247f7d-6187-4052-baee-5c5841e1d9da" (UID: "4c247f7d-6187-4052-baee-5c5841e1d9da"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.380969 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-cni-bin\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381023 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-kubelet\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381053 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-node-log\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381075 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-env-overrides\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381092 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-ovn\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381088 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-cni-bin\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381110 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/713d7d21-0b1a-4dac-a300-4974106541a7-ovn-node-metrics-cert\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381176 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381214 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-systemd-units\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381237 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-systemd\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381233 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-kubelet\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381252 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-ovn\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381259 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381281 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381267 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-node-log\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381301 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381324 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-run-systemd\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381350 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-var-lib-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381393 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-ovnkube-config\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381411 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-slash\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381425 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-var-lib-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-cni-netd\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381468 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-run-netns\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381474 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-slash\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381491 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-log-socket\") pod \"ovnkube-node-pztc6\" (UID: 
\"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381502 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-cni-netd\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381520 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-run-netns\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381530 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-ovnkube-script-lib\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381552 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-log-socket\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381573 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381599 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7mx\" (UniqueName: \"kubernetes.io/projected/713d7d21-0b1a-4dac-a300-4974106541a7-kube-api-access-4d7mx\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381633 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-etc-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381682 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlfn9\" (UniqueName: \"kubernetes.io/projected/4c247f7d-6187-4052-baee-5c5841e1d9da-kube-api-access-dlfn9\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381692 4693 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4c247f7d-6187-4052-baee-5c5841e1d9da-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381703 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4c247f7d-6187-4052-baee-5c5841e1d9da-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.381729 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-etc-openvswitch\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.382014 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-systemd-units\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.382024 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-env-overrides\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.382027 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/713d7d21-0b1a-4dac-a300-4974106541a7-host-run-ovn-kubernetes\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.382153 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-ovnkube-config\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.382475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/713d7d21-0b1a-4dac-a300-4974106541a7-ovnkube-script-lib\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.385070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/713d7d21-0b1a-4dac-a300-4974106541a7-ovn-node-metrics-cert\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.405961 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d7mx\" (UniqueName: \"kubernetes.io/projected/713d7d21-0b1a-4dac-a300-4974106541a7-kube-api-access-4d7mx\") pod \"ovnkube-node-pztc6\" (UID: \"713d7d21-0b1a-4dac-a300-4974106541a7\") " pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:50 crc kubenswrapper[4693]: I1125 12:18:50.528011 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.011675 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/2.log" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.012659 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/1.log" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.012721 4693 generic.go:334] "Generic (PLEG): container finished" podID="f714b419-cf37-48b7-9b1a-d36291d788a0" containerID="341e4691994213e289b6b6687d57d43f5c4a8981a11aa8daf3be474b270d87f7" exitCode=2 Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.012806 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerDied","Data":"341e4691994213e289b6b6687d57d43f5c4a8981a11aa8daf3be474b270d87f7"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.012845 4693 scope.go:117] "RemoveContainer" containerID="382211ae43e333d7bb7c5f1a1ab9556b12e5b61664925168b887ab596f56a486" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.013395 4693 scope.go:117] "RemoveContainer" containerID="341e4691994213e289b6b6687d57d43f5c4a8981a11aa8daf3be474b270d87f7" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.013554 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6l9jx_openshift-multus(f714b419-cf37-48b7-9b1a-d36291d788a0)\"" pod="openshift-multus/multus-6l9jx" podUID="f714b419-cf37-48b7-9b1a-d36291d788a0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.017887 4693 generic.go:334] "Generic (PLEG): container finished" podID="713d7d21-0b1a-4dac-a300-4974106541a7" containerID="7aa92872a33ed00302e8a30d52caa4cafaa829b199cbea9495d5a5c9b85023ad" exitCode=0 Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.017941 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerDied","Data":"7aa92872a33ed00302e8a30d52caa4cafaa829b199cbea9495d5a5c9b85023ad"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.017961 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"ccd41bd41c629fb5f4067c76661a1a754f0569293ec122704a494b9ec89af86a"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.023256 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovnkube-controller/3.log" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.025650 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovn-acl-logging/0.log" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026226 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sn9jm_4c247f7d-6187-4052-baee-5c5841e1d9da/ovn-controller/0.log" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026637 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" exitCode=0 Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026661 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" exitCode=0 Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026669 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" exitCode=0 Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026675 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" exitCode=0 Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026681 4693 generic.go:334] "Generic (PLEG): container finished" podID="4c247f7d-6187-4052-baee-5c5841e1d9da" containerID="709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" exitCode=0 Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026700 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026723 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026733 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026741 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026752 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026760 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" event={"ID":"4c247f7d-6187-4052-baee-5c5841e1d9da","Type":"ContainerDied","Data":"91900e5c13c852ec9260b6bedac82c3a81d05e42bece88af1e7b8adace084239"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026770 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026780 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026785 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026790 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026795 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026799 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026804 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026809 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026814 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026819 4693 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778"} Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.026895 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sn9jm" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.079533 4693 scope.go:117] "RemoveContainer" containerID="bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.098590 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sn9jm"] Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.102635 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sn9jm"] Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.103285 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.135500 4693 scope.go:117] "RemoveContainer" containerID="55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.151867 4693 scope.go:117] "RemoveContainer" containerID="3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.168591 4693 scope.go:117] "RemoveContainer" containerID="72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.182315 4693 scope.go:117] "RemoveContainer" containerID="d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.197472 4693 scope.go:117] "RemoveContainer" containerID="709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.224864 4693 scope.go:117] "RemoveContainer" containerID="f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.241879 4693 scope.go:117] "RemoveContainer" containerID="8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.265848 4693 scope.go:117] "RemoveContainer" containerID="a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.287555 4693 scope.go:117] "RemoveContainer" containerID="bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.288412 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": container with ID starting with bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883 not found: ID does not exist" containerID="bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.288460 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883"} err="failed to get container status \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": rpc error: code = NotFound desc = could not find container \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": container with ID starting with bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.288497 4693 scope.go:117] "RemoveContainer" 
containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.288896 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": container with ID starting with 06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0 not found: ID does not exist" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.288941 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0"} err="failed to get container status \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": rpc error: code = NotFound desc = could not find container \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": container with ID starting with 06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.288981 4693 scope.go:117] "RemoveContainer" containerID="55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.289619 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": container with ID starting with 55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1 not found: ID does not exist" containerID="55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.289641 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1"} err="failed to get container status \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": rpc error: code = NotFound desc = could not find container \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": container with ID starting with 55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.289657 4693 scope.go:117] "RemoveContainer" containerID="3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.290551 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": container with ID starting with 3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc not found: ID does not exist" containerID="3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.290633 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc"} err="failed to get container status \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": rpc error: code = NotFound desc = could not find container \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": container with ID starting with 
3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.290690 4693 scope.go:117] "RemoveContainer" containerID="72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.292817 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": container with ID starting with 72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f not found: ID does not exist" containerID="72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.292842 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f"} err="failed to get container status \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": rpc error: code = NotFound desc = could not find container \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": container with ID starting with 72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.292872 4693 scope.go:117] "RemoveContainer" containerID="d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.293171 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": container with ID starting with d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252 not found: ID does not exist" containerID="d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.293190 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252"} err="failed to get container status \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": rpc error: code = NotFound desc = could not find container \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": container with ID starting with d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.293206 4693 scope.go:117] "RemoveContainer" containerID="709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.293492 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": container with ID starting with 709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be not found: ID does not exist" containerID="709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.293521 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be"} err="failed to get container status \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": rpc 
error: code = NotFound desc = could not find container \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": container with ID starting with 709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.293544 4693 scope.go:117] "RemoveContainer" containerID="f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.294749 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": container with ID starting with f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae not found: ID does not exist" containerID="f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.294803 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae"} err="failed to get container status \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": rpc error: code = NotFound desc = could not find container \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": container with ID starting with f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.294981 4693 scope.go:117] "RemoveContainer" containerID="8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.297063 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": container with ID starting with 8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0 not found: ID does not exist" containerID="8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.297100 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0"} err="failed to get container status \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": rpc error: code = NotFound desc = could not find container \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": container with ID starting with 8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.297130 4693 scope.go:117] "RemoveContainer" containerID="a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778" Nov 25 12:18:51 crc kubenswrapper[4693]: E1125 12:18:51.298543 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": container with ID starting with a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778 not found: ID does not exist" containerID="a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.298731 4693 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778"} err="failed to get container status \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": rpc error: code = NotFound desc = could not find container \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": container with ID starting with a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.298755 4693 scope.go:117] "RemoveContainer" containerID="bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.299035 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883"} err="failed to get container status \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": rpc error: code = NotFound desc = could not find container \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": container with ID starting with bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.299057 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.299387 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0"} err="failed to get container status \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": rpc error: code = NotFound desc = could not find container \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": container with ID starting with 06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.299421 4693 scope.go:117] "RemoveContainer" containerID="55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.299685 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1"} err="failed to get container status \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": rpc error: code = NotFound desc = could not find container \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": container with ID starting with 55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.299708 4693 scope.go:117] "RemoveContainer" containerID="3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.300125 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc"} err="failed to get container status \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": rpc error: code = NotFound desc = could not find container \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": container with ID starting with 3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc not found: ID does not exist" Nov 
25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.300150 4693 scope.go:117] "RemoveContainer" containerID="72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.300395 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f"} err="failed to get container status \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": rpc error: code = NotFound desc = could not find container \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": container with ID starting with 72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.300423 4693 scope.go:117] "RemoveContainer" containerID="d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.300848 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252"} err="failed to get container status \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": rpc error: code = NotFound desc = could not find container \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": container with ID starting with d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.300880 4693 scope.go:117] "RemoveContainer" containerID="709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.301143 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be"} err="failed to get container status \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": rpc error: code = NotFound desc = could not find container \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": container with ID starting with 709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.301164 4693 scope.go:117] "RemoveContainer" containerID="f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.301428 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae"} err="failed to get container status \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": rpc error: code = NotFound desc = could not find container \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": container with ID starting with f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.301467 4693 scope.go:117] "RemoveContainer" containerID="8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.301859 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0"} err="failed to get container status 
\"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": rpc error: code = NotFound desc = could not find container \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": container with ID starting with 8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.301877 4693 scope.go:117] "RemoveContainer" containerID="a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.302128 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778"} err="failed to get container status \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": rpc error: code = NotFound desc = could not find container \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": container with ID starting with a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.302146 4693 scope.go:117] "RemoveContainer" containerID="bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.303151 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883"} err="failed to get container status \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": rpc error: code = NotFound desc = could not find container \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": container with ID starting with bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.303177 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.303394 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0"} err="failed to get container status \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": rpc error: code = NotFound desc = could not find container \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": container with ID starting with 06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.303419 4693 scope.go:117] "RemoveContainer" containerID="55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.303885 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1"} err="failed to get container status \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": rpc error: code = NotFound desc = could not find container \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": container with ID starting with 55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.303927 4693 scope.go:117] "RemoveContainer" 
containerID="3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.304876 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc"} err="failed to get container status \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": rpc error: code = NotFound desc = could not find container \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": container with ID starting with 3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.304901 4693 scope.go:117] "RemoveContainer" containerID="72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.305158 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f"} err="failed to get container status \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": rpc error: code = NotFound desc = could not find container \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": container with ID starting with 72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.305180 4693 scope.go:117] "RemoveContainer" containerID="d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.305360 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252"} err="failed to get container status \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": rpc error: code = NotFound desc = could not find container \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": container with ID starting with d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.305399 4693 scope.go:117] "RemoveContainer" containerID="709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.305582 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be"} err="failed to get container status \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": rpc error: code = NotFound desc = could not find container \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": container with ID starting with 709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.305603 4693 scope.go:117] "RemoveContainer" containerID="f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.306513 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae"} err="failed to get container status \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": rpc error: code = NotFound desc = could not find 
container \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": container with ID starting with f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.306551 4693 scope.go:117] "RemoveContainer" containerID="8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.306900 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0"} err="failed to get container status \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": rpc error: code = NotFound desc = could not find container \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": container with ID starting with 8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.306931 4693 scope.go:117] "RemoveContainer" containerID="a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.307167 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778"} err="failed to get container status \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": rpc error: code = NotFound desc = could not find container \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": container with ID starting with a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.307190 4693 scope.go:117] "RemoveContainer" containerID="bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.307470 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883"} err="failed to get container status \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": rpc error: code = NotFound desc = could not find container \"bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883\": container with ID starting with bf4c51a654480a87cacc5780059581c97cfd2b31bf412de54837d282d3702883 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.307491 4693 scope.go:117] "RemoveContainer" containerID="06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.307722 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0"} err="failed to get container status \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": rpc error: code = NotFound desc = could not find container \"06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0\": container with ID starting with 06dd4a1becad6d8e864f14c5e7b503358fa5cc0d12ada0d9c46e2b402b4a90e0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.307780 4693 scope.go:117] "RemoveContainer" containerID="55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308043 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1"} err="failed to get container status \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": rpc error: code = NotFound desc = could not find container \"55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1\": container with ID starting with 55fef696a8bdb6f863c97746fc825a788b2c355c9a940b88e3d22b6c5cb90cf1 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308065 4693 scope.go:117] "RemoveContainer" containerID="3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308310 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc"} err="failed to get container status \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": rpc error: code = NotFound desc = could not find container \"3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc\": container with ID starting with 3de8d575e7bd6698ca59a1f15670b9fb2200b891817e758e8431f387cf09bcbc not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308333 4693 scope.go:117] "RemoveContainer" containerID="72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308576 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f"} err="failed to get container status \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": rpc error: code = NotFound desc = could not find container \"72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f\": container with ID starting with 72f5d38e017b43bb964a42aadfa74c3a9e3df62be727c4cdfdd7b5366858732f not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308596 4693 scope.go:117] "RemoveContainer" containerID="d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308815 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252"} err="failed to get container status \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": rpc error: code = NotFound desc = could not find container \"d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252\": container with ID starting with d5183b14036b967d6919422f1b20e78821c98a4868684d1a8df11c816f128252 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.308839 4693 scope.go:117] "RemoveContainer" containerID="709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.309037 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be"} err="failed to get container status \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": rpc error: code = NotFound desc = could not find container \"709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be\": container with ID starting with 
709c23932996b69ed2d51d48da07aee141f9852f664c9f872ebc82a0cf07d9be not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.309061 4693 scope.go:117] "RemoveContainer" containerID="f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.309252 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae"} err="failed to get container status \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": rpc error: code = NotFound desc = could not find container \"f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae\": container with ID starting with f7ce0b38c7e974acc956ba69a991f93cc92d8673b8b8ec6272e553952b677cae not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.309275 4693 scope.go:117] "RemoveContainer" containerID="8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.309484 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0"} err="failed to get container status \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": rpc error: code = NotFound desc = could not find container \"8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0\": container with ID starting with 8d58b4f707be4ceac9bdda5956530edf332ac70698e95870f53152825cde17a0 not found: ID does not exist" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.309511 4693 scope.go:117] "RemoveContainer" containerID="a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778" Nov 25 12:18:51 crc kubenswrapper[4693]: I1125 12:18:51.309751 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778"} err="failed to get container status \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": rpc error: code = NotFound desc = could not find container \"a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778\": container with ID starting with a6f278c8b0b0fdd63ff6972dc44ad8de31575eef2069855c7c4b43d54581d778 not found: ID does not exist" Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.034089 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/2.log" Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.040314 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"5e2ce65be105a754b774501e471f4dea2ea0a76d559793a67eceb9f89e30463b"} Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.040425 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"08ded2bcb639f1adc1e646d714181e8987ebb6797e3c7322762bf532343092ab"} Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.040460 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" 
event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"a72191a0c7586f284c6e7c7ac0de843200b8280cc46d89a09f7310fc94c77e16"} Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.040485 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"f0526289ae210a9a22ddac9737ad2a60f60ad924c697488e86b605152e835757"} Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.040510 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"a36df2a8d4c1f58d50b6abd97241971e1c84e57d7fcac4f181bb72ff13128977"} Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.040534 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"2551ba356f43bd82543a506821d70360a3c73e0b1c1fac637a78a39f7fc242fb"} Nov 25 12:18:52 crc kubenswrapper[4693]: I1125 12:18:52.820133 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c247f7d-6187-4052-baee-5c5841e1d9da" path="/var/lib/kubelet/pods/4c247f7d-6187-4052-baee-5c5841e1d9da/volumes" Nov 25 12:18:54 crc kubenswrapper[4693]: I1125 12:18:54.070368 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"32c127a5ab676e9cefa0131c92e855f105f8e2edbca9d8d2d48295c3a12706bf"} Nov 25 12:18:58 crc kubenswrapper[4693]: I1125 12:18:58.097430 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" event={"ID":"713d7d21-0b1a-4dac-a300-4974106541a7","Type":"ContainerStarted","Data":"3ac4182b7b6a5586fb8af3055e0b461f0fa1d7664415f233ed13bfee5d29b9fc"} Nov 25 12:18:58 crc kubenswrapper[4693]: I1125 12:18:58.097893 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:58 crc kubenswrapper[4693]: I1125 12:18:58.097906 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:58 crc kubenswrapper[4693]: I1125 12:18:58.097917 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:58 crc kubenswrapper[4693]: I1125 12:18:58.124601 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:18:58 crc kubenswrapper[4693]: I1125 12:18:58.128492 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" podStartSLOduration=8.128469593 podStartE2EDuration="8.128469593s" podCreationTimestamp="2025-11-25 12:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:18:58.1249649 +0000 UTC m=+658.043050291" watchObservedRunningTime="2025-11-25 12:18:58.128469593 +0000 UTC m=+658.046554974" Nov 25 12:18:58 crc kubenswrapper[4693]: I1125 12:18:58.134733 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:19:03 crc kubenswrapper[4693]: I1125 
12:19:03.812338 4693 scope.go:117] "RemoveContainer" containerID="341e4691994213e289b6b6687d57d43f5c4a8981a11aa8daf3be474b270d87f7" Nov 25 12:19:03 crc kubenswrapper[4693]: E1125 12:19:03.812985 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6l9jx_openshift-multus(f714b419-cf37-48b7-9b1a-d36291d788a0)\"" pod="openshift-multus/multus-6l9jx" podUID="f714b419-cf37-48b7-9b1a-d36291d788a0" Nov 25 12:19:16 crc kubenswrapper[4693]: I1125 12:19:16.813558 4693 scope.go:117] "RemoveContainer" containerID="341e4691994213e289b6b6687d57d43f5c4a8981a11aa8daf3be474b270d87f7" Nov 25 12:19:18 crc kubenswrapper[4693]: I1125 12:19:18.225229 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6l9jx_f714b419-cf37-48b7-9b1a-d36291d788a0/kube-multus/2.log" Nov 25 12:19:18 crc kubenswrapper[4693]: I1125 12:19:18.225546 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6l9jx" event={"ID":"f714b419-cf37-48b7-9b1a-d36291d788a0","Type":"ContainerStarted","Data":"8aba608bfebd0ade55e250823249ae8ae6f4577cae7d3d43ddc15b2e6a21f2ad"} Nov 25 12:19:20 crc kubenswrapper[4693]: I1125 12:19:20.549610 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pztc6" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.482903 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q"] Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.484446 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.486681 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.490706 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.490759 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.490782 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdpml\" (UniqueName: \"kubernetes.io/projected/47fe04a3-31d1-4c8f-bccd-109447168f70-kube-api-access-tdpml\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.495740 
4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q"] Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.592416 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.592889 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.593169 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.593410 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.593646 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdpml\" (UniqueName: \"kubernetes.io/projected/47fe04a3-31d1-4c8f-bccd-109447168f70-kube-api-access-tdpml\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.629524 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdpml\" (UniqueName: \"kubernetes.io/projected/47fe04a3-31d1-4c8f-bccd-109447168f70-kube-api-access-tdpml\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:36 crc kubenswrapper[4693]: I1125 12:19:36.803586 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:37 crc kubenswrapper[4693]: I1125 12:19:37.030771 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q"] Nov 25 12:19:37 crc kubenswrapper[4693]: I1125 12:19:37.338618 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" event={"ID":"47fe04a3-31d1-4c8f-bccd-109447168f70","Type":"ContainerStarted","Data":"bae43e9dbbffb8e1230252f8f3c098639659f16a5a7fd396b7feed36b3d8e981"} Nov 25 12:19:37 crc kubenswrapper[4693]: I1125 12:19:37.339023 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" event={"ID":"47fe04a3-31d1-4c8f-bccd-109447168f70","Type":"ContainerStarted","Data":"b95a4cc5b165d992775ab067a25425798123110738adee57090e3a79c824a494"} Nov 25 12:19:38 crc kubenswrapper[4693]: I1125 12:19:38.363055 4693 generic.go:334] "Generic (PLEG): container finished" podID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerID="bae43e9dbbffb8e1230252f8f3c098639659f16a5a7fd396b7feed36b3d8e981" exitCode=0 Nov 25 12:19:38 crc kubenswrapper[4693]: I1125 12:19:38.363095 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" event={"ID":"47fe04a3-31d1-4c8f-bccd-109447168f70","Type":"ContainerDied","Data":"bae43e9dbbffb8e1230252f8f3c098639659f16a5a7fd396b7feed36b3d8e981"} Nov 25 12:19:40 crc kubenswrapper[4693]: I1125 12:19:40.375108 4693 generic.go:334] "Generic (PLEG): container finished" podID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerID="46626236621eb1f4b9ed23d6283b5f92398b72e14cbe1b4e96fa9fa0e1ceac09" exitCode=0 Nov 25 12:19:40 crc kubenswrapper[4693]: I1125 12:19:40.375152 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" event={"ID":"47fe04a3-31d1-4c8f-bccd-109447168f70","Type":"ContainerDied","Data":"46626236621eb1f4b9ed23d6283b5f92398b72e14cbe1b4e96fa9fa0e1ceac09"} Nov 25 12:19:41 crc kubenswrapper[4693]: I1125 12:19:41.382825 4693 generic.go:334] "Generic (PLEG): container finished" podID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerID="16cffbece3976de2c25a2aecad6bda3437b5e9b29b3f5173b5af598bd74d6d64" exitCode=0 Nov 25 12:19:41 crc kubenswrapper[4693]: I1125 12:19:41.382904 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" event={"ID":"47fe04a3-31d1-4c8f-bccd-109447168f70","Type":"ContainerDied","Data":"16cffbece3976de2c25a2aecad6bda3437b5e9b29b3f5173b5af598bd74d6d64"} Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.636352 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.689207 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-util\") pod \"47fe04a3-31d1-4c8f-bccd-109447168f70\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.689317 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdpml\" (UniqueName: \"kubernetes.io/projected/47fe04a3-31d1-4c8f-bccd-109447168f70-kube-api-access-tdpml\") pod \"47fe04a3-31d1-4c8f-bccd-109447168f70\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.689407 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-bundle\") pod \"47fe04a3-31d1-4c8f-bccd-109447168f70\" (UID: \"47fe04a3-31d1-4c8f-bccd-109447168f70\") " Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.690142 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-bundle" (OuterVolumeSpecName: "bundle") pod "47fe04a3-31d1-4c8f-bccd-109447168f70" (UID: "47fe04a3-31d1-4c8f-bccd-109447168f70"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.695483 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47fe04a3-31d1-4c8f-bccd-109447168f70-kube-api-access-tdpml" (OuterVolumeSpecName: "kube-api-access-tdpml") pod "47fe04a3-31d1-4c8f-bccd-109447168f70" (UID: "47fe04a3-31d1-4c8f-bccd-109447168f70"). InnerVolumeSpecName "kube-api-access-tdpml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.790241 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdpml\" (UniqueName: \"kubernetes.io/projected/47fe04a3-31d1-4c8f-bccd-109447168f70-kube-api-access-tdpml\") on node \"crc\" DevicePath \"\"" Nov 25 12:19:42 crc kubenswrapper[4693]: I1125 12:19:42.790286 4693 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:19:43 crc kubenswrapper[4693]: I1125 12:19:43.040610 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-util" (OuterVolumeSpecName: "util") pod "47fe04a3-31d1-4c8f-bccd-109447168f70" (UID: "47fe04a3-31d1-4c8f-bccd-109447168f70"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:19:43 crc kubenswrapper[4693]: I1125 12:19:43.101758 4693 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47fe04a3-31d1-4c8f-bccd-109447168f70-util\") on node \"crc\" DevicePath \"\"" Nov 25 12:19:43 crc kubenswrapper[4693]: I1125 12:19:43.396262 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" event={"ID":"47fe04a3-31d1-4c8f-bccd-109447168f70","Type":"ContainerDied","Data":"b95a4cc5b165d992775ab067a25425798123110738adee57090e3a79c824a494"} Nov 25 12:19:43 crc kubenswrapper[4693]: I1125 12:19:43.396308 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b95a4cc5b165d992775ab067a25425798123110738adee57090e3a79c824a494" Nov 25 12:19:43 crc kubenswrapper[4693]: I1125 12:19:43.396671 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.150538 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-vv6jm"] Nov 25 12:19:45 crc kubenswrapper[4693]: E1125 12:19:45.150762 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerName="pull" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.150774 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerName="pull" Nov 25 12:19:45 crc kubenswrapper[4693]: E1125 12:19:45.150784 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerName="extract" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.150791 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerName="extract" Nov 25 12:19:45 crc kubenswrapper[4693]: E1125 12:19:45.150802 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerName="util" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.150809 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerName="util" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.150901 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="47fe04a3-31d1-4c8f-bccd-109447168f70" containerName="extract" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.151238 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.153217 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.153239 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-c4xvn" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.154208 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.164684 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-vv6jm"] Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.231013 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94jt\" (UniqueName: \"kubernetes.io/projected/d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d-kube-api-access-h94jt\") pod \"nmstate-operator-557fdffb88-vv6jm\" (UID: \"d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.332030 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h94jt\" (UniqueName: \"kubernetes.io/projected/d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d-kube-api-access-h94jt\") pod \"nmstate-operator-557fdffb88-vv6jm\" (UID: \"d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.354563 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94jt\" (UniqueName: \"kubernetes.io/projected/d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d-kube-api-access-h94jt\") pod \"nmstate-operator-557fdffb88-vv6jm\" (UID: \"d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.467587 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" Nov 25 12:19:45 crc kubenswrapper[4693]: I1125 12:19:45.692197 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-vv6jm"] Nov 25 12:19:46 crc kubenswrapper[4693]: I1125 12:19:46.417791 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" event={"ID":"d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d","Type":"ContainerStarted","Data":"74a4d6d4bf0e1423b42314b74e7856f44782cc8aeee66b3bdec03dd1a7acd098"} Nov 25 12:19:48 crc kubenswrapper[4693]: I1125 12:19:48.439050 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" event={"ID":"d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d","Type":"ContainerStarted","Data":"8d3fcd3a8ebda55c390743c6d07de5e5ba9c9ac9a8f9a11b674ef2795c79a9e0"} Nov 25 12:19:48 crc kubenswrapper[4693]: I1125 12:19:48.460687 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-vv6jm" podStartSLOduration=1.4642919509999999 podStartE2EDuration="3.460668965s" podCreationTimestamp="2025-11-25 12:19:45 +0000 UTC" firstStartedPulling="2025-11-25 12:19:45.702448074 +0000 UTC m=+705.620533455" lastFinishedPulling="2025-11-25 12:19:47.698825048 +0000 UTC m=+707.616910469" observedRunningTime="2025-11-25 12:19:48.458207744 +0000 UTC m=+708.376293145" watchObservedRunningTime="2025-11-25 12:19:48.460668965 +0000 UTC m=+708.378754346" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.380949 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8"] Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.381959 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.383548 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-km2nn" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.389802 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8"] Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.397459 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz"] Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.398141 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.405677 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.423531 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz"] Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.446496 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-k95kj"] Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.447234 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.490171 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfcf4\" (UniqueName: \"kubernetes.io/projected/b2d6c353-42d6-4c35-8c14-925f97540979-kube-api-access-nfcf4\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.490227 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-nmstate-lock\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.490256 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg7p7\" (UniqueName: \"kubernetes.io/projected/27d9ec74-a9f1-4971-a6ad-16703ad324ad-kube-api-access-fg7p7\") pod \"nmstate-webhook-6b89b748d8-wv8pz\" (UID: \"27d9ec74-a9f1-4971-a6ad-16703ad324ad\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.490322 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-dbus-socket\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.490346 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/27d9ec74-a9f1-4971-a6ad-16703ad324ad-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-wv8pz\" (UID: \"27d9ec74-a9f1-4971-a6ad-16703ad324ad\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.490417 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-ovs-socket\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.490448 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkmdv\" (UniqueName: \"kubernetes.io/projected/0a6d6078-b39e-4528-a765-5624dee71294-kube-api-access-xkmdv\") pod \"nmstate-metrics-5dcf9c57c5-sxzv8\" (UID: \"0a6d6078-b39e-4528-a765-5624dee71294\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.510247 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"] Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.511242 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.513309 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.513333 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jxwmh" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.513351 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.524056 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"] Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591749 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fef9b8d4-8a67-486c-84d4-f0053c7efe32-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591795 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-dbus-socket\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591815 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/27d9ec74-a9f1-4971-a6ad-16703ad324ad-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-wv8pz\" (UID: \"27d9ec74-a9f1-4971-a6ad-16703ad324ad\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591842 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xc2\" (UniqueName: \"kubernetes.io/projected/fef9b8d4-8a67-486c-84d4-f0053c7efe32-kube-api-access-44xc2\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591865 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fef9b8d4-8a67-486c-84d4-f0053c7efe32-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591886 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-ovs-socket\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591907 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkmdv\" (UniqueName: \"kubernetes.io/projected/0a6d6078-b39e-4528-a765-5624dee71294-kube-api-access-xkmdv\") pod 
\"nmstate-metrics-5dcf9c57c5-sxzv8\" (UID: \"0a6d6078-b39e-4528-a765-5624dee71294\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591939 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfcf4\" (UniqueName: \"kubernetes.io/projected/b2d6c353-42d6-4c35-8c14-925f97540979-kube-api-access-nfcf4\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591962 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-nmstate-lock\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.591984 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg7p7\" (UniqueName: \"kubernetes.io/projected/27d9ec74-a9f1-4971-a6ad-16703ad324ad-kube-api-access-fg7p7\") pod \"nmstate-webhook-6b89b748d8-wv8pz\" (UID: \"27d9ec74-a9f1-4971-a6ad-16703ad324ad\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.592017 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-ovs-socket\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.592068 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-nmstate-lock\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.592289 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b2d6c353-42d6-4c35-8c14-925f97540979-dbus-socket\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:49 crc kubenswrapper[4693]: E1125 12:19:49.594238 4693 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 25 12:19:49 crc kubenswrapper[4693]: E1125 12:19:49.594349 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27d9ec74-a9f1-4971-a6ad-16703ad324ad-tls-key-pair podName:27d9ec74-a9f1-4971-a6ad-16703ad324ad nodeName:}" failed. No retries permitted until 2025-11-25 12:19:50.094325626 +0000 UTC m=+710.012411007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/27d9ec74-a9f1-4971-a6ad-16703ad324ad-tls-key-pair") pod "nmstate-webhook-6b89b748d8-wv8pz" (UID: "27d9ec74-a9f1-4971-a6ad-16703ad324ad") : secret "openshift-nmstate-webhook" not found
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.628150 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfcf4\" (UniqueName: \"kubernetes.io/projected/b2d6c353-42d6-4c35-8c14-925f97540979-kube-api-access-nfcf4\") pod \"nmstate-handler-k95kj\" (UID: \"b2d6c353-42d6-4c35-8c14-925f97540979\") " pod="openshift-nmstate/nmstate-handler-k95kj"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.628171 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkmdv\" (UniqueName: \"kubernetes.io/projected/0a6d6078-b39e-4528-a765-5624dee71294-kube-api-access-xkmdv\") pod \"nmstate-metrics-5dcf9c57c5-sxzv8\" (UID: \"0a6d6078-b39e-4528-a765-5624dee71294\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.630195 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg7p7\" (UniqueName: \"kubernetes.io/projected/27d9ec74-a9f1-4971-a6ad-16703ad324ad-kube-api-access-fg7p7\") pod \"nmstate-webhook-6b89b748d8-wv8pz\" (UID: \"27d9ec74-a9f1-4971-a6ad-16703ad324ad\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.693396 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fef9b8d4-8a67-486c-84d4-f0053c7efe32-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.693479 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44xc2\" (UniqueName: \"kubernetes.io/projected/fef9b8d4-8a67-486c-84d4-f0053c7efe32-kube-api-access-44xc2\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.693502 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fef9b8d4-8a67-486c-84d4-f0053c7efe32-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"
Nov 25 12:19:49 crc kubenswrapper[4693]: E1125 12:19:49.693615 4693 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Nov 25 12:19:49 crc kubenswrapper[4693]: E1125 12:19:49.693708 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fef9b8d4-8a67-486c-84d4-f0053c7efe32-plugin-serving-cert podName:fef9b8d4-8a67-486c-84d4-f0053c7efe32 nodeName:}" failed. No retries permitted until 2025-11-25 12:19:50.193684448 +0000 UTC m=+710.111769829 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/fef9b8d4-8a67-486c-84d4-f0053c7efe32-plugin-serving-cert") pod "nmstate-console-plugin-5874bd7bc5-wc25g" (UID: "fef9b8d4-8a67-486c-84d4-f0053c7efe32") : secret "plugin-serving-cert" not found
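
The two mount failures above are the kubelet waiting out a race: the webhook and console-plugin pods were scheduled before their TLS Secrets existed, so SetUp fails, the operation is parked, and a retry is scheduled (the "No retries permitted until ... (durationBeforeRetry 500ms)" fields), with the delay growing on repeated failures; both volumes mount successfully about half a second later, once the secrets exist. A minimal sketch of that retry shape, with a hypothetical fetchSecret standing in for the API lookup (illustrative only, not the kubelet's actual nestedpendingoperations code):

    package main

    import (
        "fmt"
        "time"
    )

    // fetchSecret stands in for the API lookup that fails above with
    // `secret "openshift-nmstate-webhook" not found`; here the secret
    // simply "appears" at the given time. Hypothetical helper.
    func fetchSecret(name string, appearsAt time.Time) error {
        if time.Now().Before(appearsAt) {
            return fmt.Errorf("secret %q not found", name)
        }
        return nil
    }

    // mountWithBackoff parks the failed SetUp and retries with a growing
    // delay, the shape behind the "durationBeforeRetry 500ms" lines above.
    func mountWithBackoff(name string, appearsAt time.Time) {
        delay := 500 * time.Millisecond
        for {
            if err := fetchSecret(name, appearsAt); err != nil {
                fmt.Printf("MountVolume.SetUp failed: %v; retrying in %v\n", err, delay)
                time.Sleep(delay)
                delay *= 2 // the real kubelet caps this; the cap is omitted here
                continue
            }
            fmt.Println("MountVolume.SetUp succeeded for", name)
            return
        }
    }

    func main() {
        // The secret "appears" after ~1s, as when an operator publishes it late.
        mountWithBackoff("openshift-nmstate-webhook", time.Now().Add(time.Second))
    }
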
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.694416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fef9b8d4-8a67-486c-84d4-f0053c7efe32-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.700359 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.720178 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xc2\" (UniqueName: \"kubernetes.io/projected/fef9b8d4-8a67-486c-84d4-f0053c7efe32-kube-api-access-44xc2\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.773263 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k95kj"
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.785877 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cc6854c96-bsm7f"]
Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.786770 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.799552 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cc6854c96-bsm7f"] Nov 25 12:19:49 crc kubenswrapper[4693]: W1125 12:19:49.831855 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d6c353_42d6_4c35_8c14_925f97540979.slice/crio-0ff7aac28f759619d0888b4442a5aea032076a3560ea44cf4fdf7abb2112553b WatchSource:0}: Error finding container 0ff7aac28f759619d0888b4442a5aea032076a3560ea44cf4fdf7abb2112553b: Status 404 returned error can't find the container with id 0ff7aac28f759619d0888b4442a5aea032076a3560ea44cf4fdf7abb2112553b Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.895799 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-config\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.896175 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-service-ca\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.896821 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-oauth-config\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.896867 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgv6f\" (UniqueName: \"kubernetes.io/projected/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-kube-api-access-pgv6f\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.896889 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-oauth-serving-cert\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.896931 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-serving-cert\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.896970 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-trusted-ca-bundle\") pod 
\"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.935600 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8"] Nov 25 12:19:49 crc kubenswrapper[4693]: W1125 12:19:49.943748 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a6d6078_b39e_4528_a765_5624dee71294.slice/crio-fa3ff26febcf1c16dbb28c38313d8751894b24c5092e67493ed7509c1facc9c3 WatchSource:0}: Error finding container fa3ff26febcf1c16dbb28c38313d8751894b24c5092e67493ed7509c1facc9c3: Status 404 returned error can't find the container with id fa3ff26febcf1c16dbb28c38313d8751894b24c5092e67493ed7509c1facc9c3 Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.997876 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-oauth-config\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.997933 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgv6f\" (UniqueName: \"kubernetes.io/projected/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-kube-api-access-pgv6f\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.997959 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-oauth-serving-cert\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.997994 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-serving-cert\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.998032 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-trusted-ca-bundle\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.998063 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-config\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:49 crc kubenswrapper[4693]: I1125 12:19:49.998079 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-service-ca\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " 
pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.099335 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/27d9ec74-a9f1-4971-a6ad-16703ad324ad-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-wv8pz\" (UID: \"27d9ec74-a9f1-4971-a6ad-16703ad324ad\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.126690 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-oauth-config\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.127415 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-service-ca\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.128246 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-config\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.129397 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-console-serving-cert\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.130231 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-oauth-serving-cert\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.130313 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-trusted-ca-bundle\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.142791 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/27d9ec74-a9f1-4971-a6ad-16703ad324ad-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-wv8pz\" (UID: \"27d9ec74-a9f1-4971-a6ad-16703ad324ad\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.202053 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fef9b8d4-8a67-486c-84d4-f0053c7efe32-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " 
pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.206929 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fef9b8d4-8a67-486c-84d4-f0053c7efe32-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-wc25g\" (UID: \"fef9b8d4-8a67-486c-84d4-f0053c7efe32\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.260818 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgv6f\" (UniqueName: \"kubernetes.io/projected/b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1-kube-api-access-pgv6f\") pod \"console-6cc6854c96-bsm7f\" (UID: \"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1\") " pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.317687 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.436484 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.436517 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.456981 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k95kj" event={"ID":"b2d6c353-42d6-4c35-8c14-925f97540979","Type":"ContainerStarted","Data":"0ff7aac28f759619d0888b4442a5aea032076a3560ea44cf4fdf7abb2112553b"} Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.458551 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8" event={"ID":"0a6d6078-b39e-4528-a765-5624dee71294","Type":"ContainerStarted","Data":"fa3ff26febcf1c16dbb28c38313d8751894b24c5092e67493ed7509c1facc9c3"} Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.554442 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz"] Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.702422 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cc6854c96-bsm7f"] Nov 25 12:19:50 crc kubenswrapper[4693]: W1125 12:19:50.710690 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8088a8f_2ca3_4b2b_ac4d_d19da57f00d1.slice/crio-c9b10fe8daed66f8623bfc97ec1a3e36084ab73f6bddaee2ce2d537cfbbf925b WatchSource:0}: Error finding container c9b10fe8daed66f8623bfc97ec1a3e36084ab73f6bddaee2ce2d537cfbbf925b: Status 404 returned error can't find the container with id c9b10fe8daed66f8623bfc97ec1a3e36084ab73f6bddaee2ce2d537cfbbf925b Nov 25 12:19:50 crc kubenswrapper[4693]: I1125 12:19:50.884932 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g"] Nov 25 12:19:50 crc kubenswrapper[4693]: W1125 12:19:50.904262 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfef9b8d4_8a67_486c_84d4_f0053c7efe32.slice/crio-6439a5af00643a0a7966d8b344e19748fb9417cdd98ef8c8d5babd84493e0574 WatchSource:0}: Error finding container 
6439a5af00643a0a7966d8b344e19748fb9417cdd98ef8c8d5babd84493e0574: Status 404 returned error can't find the container with id 6439a5af00643a0a7966d8b344e19748fb9417cdd98ef8c8d5babd84493e0574 Nov 25 12:19:51 crc kubenswrapper[4693]: I1125 12:19:51.467739 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" event={"ID":"fef9b8d4-8a67-486c-84d4-f0053c7efe32","Type":"ContainerStarted","Data":"6439a5af00643a0a7966d8b344e19748fb9417cdd98ef8c8d5babd84493e0574"} Nov 25 12:19:51 crc kubenswrapper[4693]: I1125 12:19:51.469257 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" event={"ID":"27d9ec74-a9f1-4971-a6ad-16703ad324ad","Type":"ContainerStarted","Data":"cc7e8e1840480806612ca5157949c84d2e303037f2ea4f7b7d19fb774f59b310"} Nov 25 12:19:51 crc kubenswrapper[4693]: I1125 12:19:51.471961 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cc6854c96-bsm7f" event={"ID":"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1","Type":"ContainerStarted","Data":"55894b62ee4a671441d657d116d71c07646b44ebffd6fe674709b432789a16b1"} Nov 25 12:19:51 crc kubenswrapper[4693]: I1125 12:19:51.472009 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cc6854c96-bsm7f" event={"ID":"b8088a8f-2ca3-4b2b-ac4d-d19da57f00d1","Type":"ContainerStarted","Data":"c9b10fe8daed66f8623bfc97ec1a3e36084ab73f6bddaee2ce2d537cfbbf925b"} Nov 25 12:19:51 crc kubenswrapper[4693]: I1125 12:19:51.498164 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cc6854c96-bsm7f" podStartSLOduration=2.498140615 podStartE2EDuration="2.498140615s" podCreationTimestamp="2025-11-25 12:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:19:51.491918574 +0000 UTC m=+711.410003955" watchObservedRunningTime="2025-11-25 12:19:51.498140615 +0000 UTC m=+711.416225996" Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.495317 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" event={"ID":"27d9ec74-a9f1-4971-a6ad-16703ad324ad","Type":"ContainerStarted","Data":"3cb9b91d48359453854f972fe3445a0cc839a8dc6206f8c0d3d29510a4cefc35"} Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.495741 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.498416 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" event={"ID":"fef9b8d4-8a67-486c-84d4-f0053c7efe32","Type":"ContainerStarted","Data":"5ca7d2f7bb5e5f5b976206a5931b94fb97284e43008d077e8f2c24dd53051114"} Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.500223 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k95kj" event={"ID":"b2d6c353-42d6-4c35-8c14-925f97540979","Type":"ContainerStarted","Data":"d32c81d651194e69c7c7eda301fbd8bbb87b26d7527b8abbd86c9e9fd81a8ac1"} Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.500331 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.501642 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8" event={"ID":"0a6d6078-b39e-4528-a765-5624dee71294","Type":"ContainerStarted","Data":"d14d3678d41f6662008b352e5219f24c82ce9b6b4624e8195604aba2de965d69"} Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.529009 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-wc25g" podStartSLOduration=2.1229225290000002 podStartE2EDuration="5.528988742s" podCreationTimestamp="2025-11-25 12:19:49 +0000 UTC" firstStartedPulling="2025-11-25 12:19:50.907879114 +0000 UTC m=+710.825964515" lastFinishedPulling="2025-11-25 12:19:54.313945347 +0000 UTC m=+714.232030728" observedRunningTime="2025-11-25 12:19:54.528275702 +0000 UTC m=+714.446361083" watchObservedRunningTime="2025-11-25 12:19:54.528988742 +0000 UTC m=+714.447074133" Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.538744 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" podStartSLOduration=1.808497176 podStartE2EDuration="5.538728678s" podCreationTimestamp="2025-11-25 12:19:49 +0000 UTC" firstStartedPulling="2025-11-25 12:19:50.577563254 +0000 UTC m=+710.495648635" lastFinishedPulling="2025-11-25 12:19:54.307794746 +0000 UTC m=+714.225880137" observedRunningTime="2025-11-25 12:19:54.514168408 +0000 UTC m=+714.432253799" watchObservedRunningTime="2025-11-25 12:19:54.538728678 +0000 UTC m=+714.456814059" Nov 25 12:19:54 crc kubenswrapper[4693]: I1125 12:19:54.551416 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-k95kj" podStartSLOduration=1.990119898 podStartE2EDuration="5.55035479s" podCreationTimestamp="2025-11-25 12:19:49 +0000 UTC" firstStartedPulling="2025-11-25 12:19:49.836425058 +0000 UTC m=+709.754510439" lastFinishedPulling="2025-11-25 12:19:53.39665995 +0000 UTC m=+713.314745331" observedRunningTime="2025-11-25 12:19:54.547314031 +0000 UTC m=+714.465399422" watchObservedRunningTime="2025-11-25 12:19:54.55035479 +0000 UTC m=+714.468440171" Nov 25 12:19:56 crc kubenswrapper[4693]: I1125 12:19:56.515007 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8" event={"ID":"0a6d6078-b39e-4528-a765-5624dee71294","Type":"ContainerStarted","Data":"5c152655a1088a292bfd2bbdaaac0fffdd5370dd5009fe96a69f29b9afa2d89f"} Nov 25 12:19:56 crc kubenswrapper[4693]: I1125 12:19:56.538591 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-sxzv8" podStartSLOduration=1.232516379 podStartE2EDuration="7.538573894s" podCreationTimestamp="2025-11-25 12:19:49 +0000 UTC" firstStartedPulling="2025-11-25 12:19:49.946634015 +0000 UTC m=+709.864719386" lastFinishedPulling="2025-11-25 12:19:56.25269152 +0000 UTC m=+716.170776901" observedRunningTime="2025-11-25 12:19:56.537845672 +0000 UTC m=+716.455931073" watchObservedRunningTime="2025-11-25 12:19:56.538573894 +0000 UTC m=+716.456659295" Nov 25 12:19:59 crc kubenswrapper[4693]: I1125 12:19:59.795748 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-k95kj" Nov 25 12:20:00 crc kubenswrapper[4693]: I1125 12:20:00.437015 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:20:00 crc kubenswrapper[4693]: I1125 12:20:00.437054 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:20:00 crc kubenswrapper[4693]: I1125 12:20:00.442228 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:20:00 crc kubenswrapper[4693]: I1125 12:20:00.544567 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cc6854c96-bsm7f" Nov 25 12:20:00 crc kubenswrapper[4693]: I1125 12:20:00.603298 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4b2tf"] Nov 25 12:20:10 crc kubenswrapper[4693]: I1125 12:20:10.323110 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-wv8pz" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.454169 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m"] Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.455744 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.458219 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.468677 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m"] Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.625115 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.625208 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r624v\" (UniqueName: \"kubernetes.io/projected/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-kube-api-access-r624v\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.625238 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.727057 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc 
kubenswrapper[4693]: I1125 12:20:22.727174 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r624v\" (UniqueName: \"kubernetes.io/projected/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-kube-api-access-r624v\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.727209 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.727749 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.727730 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.750004 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r624v\" (UniqueName: \"kubernetes.io/projected/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-kube-api-access-r624v\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.775837 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:22 crc kubenswrapper[4693]: I1125 12:20:22.988122 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m"] Nov 25 12:20:23 crc kubenswrapper[4693]: I1125 12:20:23.674320 4693 generic.go:334] "Generic (PLEG): container finished" podID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerID="15da0c4441c02269294334ab3912c6d2a2e363240decb5d58e8166fc851c7525" exitCode=0 Nov 25 12:20:23 crc kubenswrapper[4693]: I1125 12:20:23.674393 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" event={"ID":"f6fa2a73-3c18-4d17-8c57-1698fa8d987b","Type":"ContainerDied","Data":"15da0c4441c02269294334ab3912c6d2a2e363240decb5d58e8166fc851c7525"} Nov 25 12:20:23 crc kubenswrapper[4693]: I1125 12:20:23.674424 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" event={"ID":"f6fa2a73-3c18-4d17-8c57-1698fa8d987b","Type":"ContainerStarted","Data":"106e1709326e720725fe0e01ffc271943c457c49c86078e268a41eccc52223d6"} Nov 25 12:20:25 crc kubenswrapper[4693]: I1125 12:20:25.644092 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-4b2tf" podUID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" containerName="console" containerID="cri-o://5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0" gracePeriod=15 Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.053236 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4b2tf_725c1b7d-81c5-4bbe-99b1-c53b93754feb/console/0.log" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.053700 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.185171 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-trusted-ca-bundle\") pod \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.185271 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-oauth-config\") pod \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.185324 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-service-ca\") pod \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.185358 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-serving-cert\") pod \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.185451 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcdsr\" (UniqueName: \"kubernetes.io/projected/725c1b7d-81c5-4bbe-99b1-c53b93754feb-kube-api-access-jcdsr\") pod \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.185488 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-config\") pod \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.185516 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-oauth-serving-cert\") pod \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\" (UID: \"725c1b7d-81c5-4bbe-99b1-c53b93754feb\") " Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.186287 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "725c1b7d-81c5-4bbe-99b1-c53b93754feb" (UID: "725c1b7d-81c5-4bbe-99b1-c53b93754feb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.186314 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "725c1b7d-81c5-4bbe-99b1-c53b93754feb" (UID: "725c1b7d-81c5-4bbe-99b1-c53b93754feb"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.186675 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-service-ca" (OuterVolumeSpecName: "service-ca") pod "725c1b7d-81c5-4bbe-99b1-c53b93754feb" (UID: "725c1b7d-81c5-4bbe-99b1-c53b93754feb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.186930 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-config" (OuterVolumeSpecName: "console-config") pod "725c1b7d-81c5-4bbe-99b1-c53b93754feb" (UID: "725c1b7d-81c5-4bbe-99b1-c53b93754feb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.191711 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725c1b7d-81c5-4bbe-99b1-c53b93754feb-kube-api-access-jcdsr" (OuterVolumeSpecName: "kube-api-access-jcdsr") pod "725c1b7d-81c5-4bbe-99b1-c53b93754feb" (UID: "725c1b7d-81c5-4bbe-99b1-c53b93754feb"). InnerVolumeSpecName "kube-api-access-jcdsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.192157 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "725c1b7d-81c5-4bbe-99b1-c53b93754feb" (UID: "725c1b7d-81c5-4bbe-99b1-c53b93754feb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.192778 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "725c1b7d-81c5-4bbe-99b1-c53b93754feb" (UID: "725c1b7d-81c5-4bbe-99b1-c53b93754feb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.286577 4693 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.286865 4693 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-service-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.286879 4693 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.286892 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcdsr\" (UniqueName: \"kubernetes.io/projected/725c1b7d-81c5-4bbe-99b1-c53b93754feb-kube-api-access-jcdsr\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.286906 4693 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-console-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.286918 4693 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.286931 4693 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/725c1b7d-81c5-4bbe-99b1-c53b93754feb-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.701104 4693 generic.go:334] "Generic (PLEG): container finished" podID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerID="786e771d9148da06f534e0e6b83c093159150d492417e3d033f2a22be9103122" exitCode=0 Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.701203 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" event={"ID":"f6fa2a73-3c18-4d17-8c57-1698fa8d987b","Type":"ContainerDied","Data":"786e771d9148da06f534e0e6b83c093159150d492417e3d033f2a22be9103122"} Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.702497 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-4b2tf_725c1b7d-81c5-4bbe-99b1-c53b93754feb/console/0.log" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.702538 4693 generic.go:334] "Generic (PLEG): container finished" podID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" containerID="5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0" exitCode=2 Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.702563 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4b2tf" event={"ID":"725c1b7d-81c5-4bbe-99b1-c53b93754feb","Type":"ContainerDied","Data":"5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0"} Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.702586 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4b2tf" 
event={"ID":"725c1b7d-81c5-4bbe-99b1-c53b93754feb","Type":"ContainerDied","Data":"eb00f2113aa83042f6c2ef890aee61801c3239fd27fb53f9074ee3c4175991a5"} Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.702609 4693 scope.go:117] "RemoveContainer" containerID="5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.702740 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4b2tf" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.737978 4693 scope.go:117] "RemoveContainer" containerID="5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0" Nov 25 12:20:26 crc kubenswrapper[4693]: E1125 12:20:26.738574 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0\": container with ID starting with 5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0 not found: ID does not exist" containerID="5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.738632 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0"} err="failed to get container status \"5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0\": rpc error: code = NotFound desc = could not find container \"5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0\": container with ID starting with 5cfe9cbedced0083fdf6c1eb02da351f2db439d15ac91f823d4da943966535f0 not found: ID does not exist" Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.741650 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-4b2tf"] Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.747413 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-4b2tf"] Nov 25 12:20:26 crc kubenswrapper[4693]: I1125 12:20:26.820497 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" path="/var/lib/kubelet/pods/725c1b7d-81c5-4bbe-99b1-c53b93754feb/volumes" Nov 25 12:20:27 crc kubenswrapper[4693]: I1125 12:20:27.721528 4693 generic.go:334] "Generic (PLEG): container finished" podID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerID="b988ed07b8130694d2043da0cbddeaf3a695d3fe273c32ae9ef2569dcf8e0d3b" exitCode=0 Nov 25 12:20:27 crc kubenswrapper[4693]: I1125 12:20:27.721583 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" event={"ID":"f6fa2a73-3c18-4d17-8c57-1698fa8d987b","Type":"ContainerDied","Data":"b988ed07b8130694d2043da0cbddeaf3a695d3fe273c32ae9ef2569dcf8e0d3b"} Nov 25 12:20:28 crc kubenswrapper[4693]: I1125 12:20:28.987254 4693 util.go:48] "No ready sandbox for pod can be found. 
Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.133795 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r624v\" (UniqueName: \"kubernetes.io/projected/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-kube-api-access-r624v\") pod \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") "
Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.133855 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-bundle\") pod \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") "
Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.133971 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-util\") pod \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\" (UID: \"f6fa2a73-3c18-4d17-8c57-1698fa8d987b\") "
Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.135845 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-bundle" (OuterVolumeSpecName: "bundle") pod "f6fa2a73-3c18-4d17-8c57-1698fa8d987b" (UID: "f6fa2a73-3c18-4d17-8c57-1698fa8d987b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.142750 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-kube-api-access-r624v" (OuterVolumeSpecName: "kube-api-access-r624v") pod "f6fa2a73-3c18-4d17-8c57-1698fa8d987b" (UID: "f6fa2a73-3c18-4d17-8c57-1698fa8d987b"). InnerVolumeSpecName "kube-api-access-r624v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.229319 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-util" (OuterVolumeSpecName: "util") pod "f6fa2a73-3c18-4d17-8c57-1698fa8d987b" (UID: "f6fa2a73-3c18-4d17-8c57-1698fa8d987b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.235026 4693 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-util\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.235066 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r624v\" (UniqueName: \"kubernetes.io/projected/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-kube-api-access-r624v\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.235082 4693 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6fa2a73-3c18-4d17-8c57-1698fa8d987b-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.527980 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-52nbn"] Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.528223 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" podUID="413025b6-a706-4ad3-b920-2c9929ddaa0e" containerName="controller-manager" containerID="cri-o://bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c" gracePeriod=30 Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.643025 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf"] Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.643250 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" podUID="a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" containerName="route-controller-manager" containerID="cri-o://7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d" gracePeriod=30 Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.739975 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" event={"ID":"f6fa2a73-3c18-4d17-8c57-1698fa8d987b","Type":"ContainerDied","Data":"106e1709326e720725fe0e01ffc271943c457c49c86078e268a41eccc52223d6"} Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.740010 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="106e1709326e720725fe0e01ffc271943c457c49c86078e268a41eccc52223d6" Nov 25 12:20:29 crc kubenswrapper[4693]: I1125 12:20:29.740049 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.501751 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.562253 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-client-ca\") pod \"413025b6-a706-4ad3-b920-2c9929ddaa0e\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.562406 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2z9c\" (UniqueName: \"kubernetes.io/projected/413025b6-a706-4ad3-b920-2c9929ddaa0e-kube-api-access-x2z9c\") pod \"413025b6-a706-4ad3-b920-2c9929ddaa0e\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.562436 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413025b6-a706-4ad3-b920-2c9929ddaa0e-serving-cert\") pod \"413025b6-a706-4ad3-b920-2c9929ddaa0e\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.562480 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-config\") pod \"413025b6-a706-4ad3-b920-2c9929ddaa0e\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.562505 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-proxy-ca-bundles\") pod \"413025b6-a706-4ad3-b920-2c9929ddaa0e\" (UID: \"413025b6-a706-4ad3-b920-2c9929ddaa0e\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.562966 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-client-ca" (OuterVolumeSpecName: "client-ca") pod "413025b6-a706-4ad3-b920-2c9929ddaa0e" (UID: "413025b6-a706-4ad3-b920-2c9929ddaa0e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.564705 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "413025b6-a706-4ad3-b920-2c9929ddaa0e" (UID: "413025b6-a706-4ad3-b920-2c9929ddaa0e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.564948 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-config" (OuterVolumeSpecName: "config") pod "413025b6-a706-4ad3-b920-2c9929ddaa0e" (UID: "413025b6-a706-4ad3-b920-2c9929ddaa0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.581598 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/413025b6-a706-4ad3-b920-2c9929ddaa0e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "413025b6-a706-4ad3-b920-2c9929ddaa0e" (UID: "413025b6-a706-4ad3-b920-2c9929ddaa0e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.600686 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413025b6-a706-4ad3-b920-2c9929ddaa0e-kube-api-access-x2z9c" (OuterVolumeSpecName: "kube-api-access-x2z9c") pod "413025b6-a706-4ad3-b920-2c9929ddaa0e" (UID: "413025b6-a706-4ad3-b920-2c9929ddaa0e"). InnerVolumeSpecName "kube-api-access-x2z9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.664340 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.664396 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2z9c\" (UniqueName: \"kubernetes.io/projected/413025b6-a706-4ad3-b920-2c9929ddaa0e-kube-api-access-x2z9c\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.664407 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413025b6-a706-4ad3-b920-2c9929ddaa0e-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.664416 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.664424 4693 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/413025b6-a706-4ad3-b920-2c9929ddaa0e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.723668 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.751569 4693 generic.go:334] "Generic (PLEG): container finished" podID="413025b6-a706-4ad3-b920-2c9929ddaa0e" containerID="bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c" exitCode=0 Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.751658 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.751662 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" event={"ID":"413025b6-a706-4ad3-b920-2c9929ddaa0e","Type":"ContainerDied","Data":"bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c"} Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.751781 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-52nbn" event={"ID":"413025b6-a706-4ad3-b920-2c9929ddaa0e","Type":"ContainerDied","Data":"75b1b27f86fa619d65d7ff0dc6a126f56683fc781ed84bf3dbe3be76ecc66558"} Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.751802 4693 scope.go:117] "RemoveContainer" containerID="bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.754976 4693 generic.go:334] "Generic (PLEG): container finished" podID="a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" containerID="7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d" exitCode=0 Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.755017 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" event={"ID":"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b","Type":"ContainerDied","Data":"7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d"} Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.755042 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" event={"ID":"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b","Type":"ContainerDied","Data":"15c4a35de0fbe2b6caf31df7c0b3bb7388d92a1d76aa302a2e1969f551f07abc"} Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.755094 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.764799 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-serving-cert\") pod \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.764880 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-config\") pod \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.764924 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-client-ca\") pod \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.764984 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkkk5\" (UniqueName: \"kubernetes.io/projected/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-kube-api-access-dkkk5\") pod \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\" (UID: \"a0db65f0-ba9a-496f-a18e-edc1c84a3f0b\") " Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.765648 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-config" (OuterVolumeSpecName: "config") pod "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" (UID: "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.765676 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-client-ca" (OuterVolumeSpecName: "client-ca") pod "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" (UID: "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.767747 4693 scope.go:117] "RemoveContainer" containerID="bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c" Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.768802 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c\": container with ID starting with bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c not found: ID does not exist" containerID="bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.768845 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c"} err="failed to get container status \"bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c\": rpc error: code = NotFound desc = could not find container \"bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c\": container with ID starting with bbc4017b8ae31ca7863f1ac3559c6eb76ca4353b9e3f594c84c8c25258c2bc1c not found: ID does not exist" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.768876 4693 scope.go:117] "RemoveContainer" containerID="7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.777001 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-kube-api-access-dkkk5" (OuterVolumeSpecName: "kube-api-access-dkkk5") pod "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" (UID: "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b"). InnerVolumeSpecName "kube-api-access-dkkk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.777429 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" (UID: "a0db65f0-ba9a-496f-a18e-edc1c84a3f0b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.787996 4693 scope.go:117] "RemoveContainer" containerID="7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d" Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.788763 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d\": container with ID starting with 7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d not found: ID does not exist" containerID="7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.788814 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d"} err="failed to get container status \"7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d\": rpc error: code = NotFound desc = could not find container \"7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d\": container with ID starting with 7feefd6f065e4cf52f2a1cb3a68468b22e34f9f1ec50b623af7fec9a5a28570d not found: ID does not exist" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.789957 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-52nbn"] Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.794726 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-52nbn"] Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.823199 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="413025b6-a706-4ad3-b920-2c9929ddaa0e" path="/var/lib/kubelet/pods/413025b6-a706-4ad3-b920-2c9929ddaa0e/volumes" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.868075 4693 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.868120 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.868129 4693 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-client-ca\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.868139 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkkk5\" (UniqueName: \"kubernetes.io/projected/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b-kube-api-access-dkkk5\") on node \"crc\" DevicePath \"\"" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.984709 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx"] Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.984987 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413025b6-a706-4ad3-b920-2c9929ddaa0e" containerName="controller-manager" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985008 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="413025b6-a706-4ad3-b920-2c9929ddaa0e" 
containerName="controller-manager" Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.985024 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" containerName="route-controller-manager" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985033 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" containerName="route-controller-manager" Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.985046 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" containerName="console" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985056 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" containerName="console" Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.985071 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerName="pull" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985079 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerName="pull" Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.985091 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerName="util" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985102 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerName="util" Nov 25 12:20:30 crc kubenswrapper[4693]: E1125 12:20:30.985114 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerName="extract" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985125 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerName="extract" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985253 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="413025b6-a706-4ad3-b920-2c9929ddaa0e" containerName="controller-manager" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985268 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="725c1b7d-81c5-4bbe-99b1-c53b93754feb" containerName="console" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985282 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6fa2a73-3c18-4d17-8c57-1698fa8d987b" containerName="extract" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985294 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" containerName="route-controller-manager" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.985744 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.989655 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.989730 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.989821 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.989824 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.992682 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.992755 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 12:20:30 crc kubenswrapper[4693]: I1125 12:20:30.997893 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx"] Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.002497 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.070895 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-proxy-ca-bundles\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.070955 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfa954a-46eb-4605-9e94-dad9196b8003-serving-cert\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.071007 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-config\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.071084 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-client-ca\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.071123 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w2nt\" (UniqueName: 
\"kubernetes.io/projected/cdfa954a-46eb-4605-9e94-dad9196b8003-kube-api-access-4w2nt\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.076617 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf"] Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.086237 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-slhjf"] Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.172259 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-client-ca\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.173528 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-client-ca\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.173752 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w2nt\" (UniqueName: \"kubernetes.io/projected/cdfa954a-46eb-4605-9e94-dad9196b8003-kube-api-access-4w2nt\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.174163 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-proxy-ca-bundles\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.174188 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfa954a-46eb-4605-9e94-dad9196b8003-serving-cert\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.174232 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-config\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.175123 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-proxy-ca-bundles\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " 
pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.176457 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cdfa954a-46eb-4605-9e94-dad9196b8003-config\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.179997 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdfa954a-46eb-4605-9e94-dad9196b8003-serving-cert\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.193125 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w2nt\" (UniqueName: \"kubernetes.io/projected/cdfa954a-46eb-4605-9e94-dad9196b8003-kube-api-access-4w2nt\") pod \"controller-manager-6d8b7f59f-q5rtx\" (UID: \"cdfa954a-46eb-4605-9e94-dad9196b8003\") " pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.306397 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.587028 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx"] Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.774571 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" event={"ID":"cdfa954a-46eb-4605-9e94-dad9196b8003","Type":"ContainerStarted","Data":"ec7433f92a2a8c289d217b3b2ce4997f09565d0b3239c837d625a34178a0eb4d"} Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.986473 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb"] Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.987175 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.989025 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.990661 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.991269 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.991607 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.991882 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 12:20:31 crc kubenswrapper[4693]: I1125 12:20:31.994609 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.000125 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb"] Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.086234 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99237415-d446-4a24-808a-0848dab2bb9c-config\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.086291 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99237415-d446-4a24-808a-0848dab2bb9c-serving-cert\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.086325 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdg4z\" (UniqueName: \"kubernetes.io/projected/99237415-d446-4a24-808a-0848dab2bb9c-kube-api-access-gdg4z\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.086382 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99237415-d446-4a24-808a-0848dab2bb9c-client-ca\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.187328 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdg4z\" (UniqueName: \"kubernetes.io/projected/99237415-d446-4a24-808a-0848dab2bb9c-kube-api-access-gdg4z\") pod 
\"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.187418 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99237415-d446-4a24-808a-0848dab2bb9c-client-ca\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.187474 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99237415-d446-4a24-808a-0848dab2bb9c-config\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.187490 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99237415-d446-4a24-808a-0848dab2bb9c-serving-cert\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.188823 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99237415-d446-4a24-808a-0848dab2bb9c-client-ca\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.189183 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99237415-d446-4a24-808a-0848dab2bb9c-config\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.191867 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99237415-d446-4a24-808a-0848dab2bb9c-serving-cert\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.204470 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdg4z\" (UniqueName: \"kubernetes.io/projected/99237415-d446-4a24-808a-0848dab2bb9c-kube-api-access-gdg4z\") pod \"route-controller-manager-7946bb64b8-lp5cb\" (UID: \"99237415-d446-4a24-808a-0848dab2bb9c\") " pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.303298 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.724813 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb"] Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.783272 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" event={"ID":"99237415-d446-4a24-808a-0848dab2bb9c","Type":"ContainerStarted","Data":"d879816631e67ac8efbebfc77977d2c7c5fc5d06b5af221c031407843781b4fb"} Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.785016 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" event={"ID":"cdfa954a-46eb-4605-9e94-dad9196b8003","Type":"ContainerStarted","Data":"c66baed4a82f13ac0d94132b21dd94cc4abb90093052a5650f0bd8fe21fafe5c"} Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.785872 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.796483 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.808621 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d8b7f59f-q5rtx" podStartSLOduration=3.808602533 podStartE2EDuration="3.808602533s" podCreationTimestamp="2025-11-25 12:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:20:32.806040848 +0000 UTC m=+752.724126249" watchObservedRunningTime="2025-11-25 12:20:32.808602533 +0000 UTC m=+752.726687914" Nov 25 12:20:32 crc kubenswrapper[4693]: I1125 12:20:32.824233 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0db65f0-ba9a-496f-a18e-edc1c84a3f0b" path="/var/lib/kubelet/pods/a0db65f0-ba9a-496f-a18e-edc1c84a3f0b/volumes" Nov 25 12:20:33 crc kubenswrapper[4693]: I1125 12:20:33.801051 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" event={"ID":"99237415-d446-4a24-808a-0848dab2bb9c","Type":"ContainerStarted","Data":"aebac9b503515b25d912ba6c89269754062a38a2e8187e5d9b3bafb26e6eca3c"} Nov 25 12:20:33 crc kubenswrapper[4693]: I1125 12:20:33.802076 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:33 crc kubenswrapper[4693]: I1125 12:20:33.808012 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" Nov 25 12:20:33 crc kubenswrapper[4693]: I1125 12:20:33.820724 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7946bb64b8-lp5cb" podStartSLOduration=3.820706573 podStartE2EDuration="3.820706573s" podCreationTimestamp="2025-11-25 12:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:20:33.815807069 +0000 UTC 
m=+753.733892450" watchObservedRunningTime="2025-11-25 12:20:33.820706573 +0000 UTC m=+753.738791954" Nov 25 12:20:35 crc kubenswrapper[4693]: I1125 12:20:35.114153 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:20:35 crc kubenswrapper[4693]: I1125 12:20:35.114486 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:20:36 crc kubenswrapper[4693]: I1125 12:20:36.693559 4693 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.386997 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc"] Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.387677 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.390174 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.390186 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.390283 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.390795 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-bbkk2" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.391763 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.411716 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc"] Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.544268 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-apiservice-cert\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.544400 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-webhook-cert\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.544493 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t8kh\" (UniqueName: \"kubernetes.io/projected/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-kube-api-access-7t8kh\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.627831 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm"] Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.628487 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.631441 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pgsm6" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.631451 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.637664 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.645910 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-apiservice-cert\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.645968 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-webhook-cert\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.646001 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t8kh\" (UniqueName: \"kubernetes.io/projected/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-kube-api-access-7t8kh\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.647184 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm"] Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.652317 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-webhook-cert\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.652351 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-apiservice-cert\") pod 
\"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.685665 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t8kh\" (UniqueName: \"kubernetes.io/projected/0d2b9e6f-fe11-47e3-af7b-cca0fff65798-kube-api-access-7t8kh\") pod \"metallb-operator-controller-manager-5995bbfc5f-c8gkc\" (UID: \"0d2b9e6f-fe11-47e3-af7b-cca0fff65798\") " pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.703931 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.746994 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-webhook-cert\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.747073 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-apiservice-cert\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.747247 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9p97\" (UniqueName: \"kubernetes.io/projected/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-kube-api-access-q9p97\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.883317 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9p97\" (UniqueName: \"kubernetes.io/projected/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-kube-api-access-q9p97\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.908725 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-webhook-cert\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.908897 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-apiservice-cert\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc 
kubenswrapper[4693]: I1125 12:20:37.928205 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-apiservice-cert\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.957934 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-webhook-cert\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:37 crc kubenswrapper[4693]: I1125 12:20:37.963892 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9p97\" (UniqueName: \"kubernetes.io/projected/9c9e4728-76ad-4ae9-8ef9-87cff7db96c3-kube-api-access-q9p97\") pod \"metallb-operator-webhook-server-7b574576ff-z9ftm\" (UID: \"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3\") " pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:38 crc kubenswrapper[4693]: I1125 12:20:38.240015 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:38 crc kubenswrapper[4693]: I1125 12:20:38.282227 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc"] Nov 25 12:20:38 crc kubenswrapper[4693]: I1125 12:20:38.663491 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm"] Nov 25 12:20:38 crc kubenswrapper[4693]: W1125 12:20:38.673699 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c9e4728_76ad_4ae9_8ef9_87cff7db96c3.slice/crio-767c7dd4f82b0b9eefd100a932127ee08fea4fdc82e5327a45972db930884566 WatchSource:0}: Error finding container 767c7dd4f82b0b9eefd100a932127ee08fea4fdc82e5327a45972db930884566: Status 404 returned error can't find the container with id 767c7dd4f82b0b9eefd100a932127ee08fea4fdc82e5327a45972db930884566 Nov 25 12:20:38 crc kubenswrapper[4693]: I1125 12:20:38.829297 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" event={"ID":"0d2b9e6f-fe11-47e3-af7b-cca0fff65798","Type":"ContainerStarted","Data":"1f2de801e33f4a81b62e95596687465c89e0a8723c7b1229cd1715ba4816f344"} Nov 25 12:20:38 crc kubenswrapper[4693]: I1125 12:20:38.830331 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" event={"ID":"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3","Type":"ContainerStarted","Data":"767c7dd4f82b0b9eefd100a932127ee08fea4fdc82e5327a45972db930884566"} Nov 25 12:20:44 crc kubenswrapper[4693]: I1125 12:20:44.872398 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" event={"ID":"0d2b9e6f-fe11-47e3-af7b-cca0fff65798","Type":"ContainerStarted","Data":"18c41683de3a4d0c38a6284ceb31c5e0f2f2a57df60a13a64d6b95c56a8faa33"} Nov 25 12:20:44 crc kubenswrapper[4693]: I1125 12:20:44.872964 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:20:44 crc kubenswrapper[4693]: I1125 12:20:44.892155 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" podStartSLOduration=2.157547064 podStartE2EDuration="7.8921358s" podCreationTimestamp="2025-11-25 12:20:37 +0000 UTC" firstStartedPulling="2025-11-25 12:20:38.305924 +0000 UTC m=+758.224009381" lastFinishedPulling="2025-11-25 12:20:44.040512716 +0000 UTC m=+763.958598117" observedRunningTime="2025-11-25 12:20:44.891181422 +0000 UTC m=+764.809266803" watchObservedRunningTime="2025-11-25 12:20:44.8921358 +0000 UTC m=+764.810221181" Nov 25 12:20:45 crc kubenswrapper[4693]: I1125 12:20:45.877917 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" event={"ID":"9c9e4728-76ad-4ae9-8ef9-87cff7db96c3","Type":"ContainerStarted","Data":"88e740d1fb73088cdbf800279582abae3170413498bb77aac40efc788d01fe3f"} Nov 25 12:20:45 crc kubenswrapper[4693]: I1125 12:20:45.898558 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" podStartSLOduration=2.344838276 podStartE2EDuration="8.898539312s" podCreationTimestamp="2025-11-25 12:20:37 +0000 UTC" firstStartedPulling="2025-11-25 12:20:38.676876729 +0000 UTC m=+758.594962110" lastFinishedPulling="2025-11-25 12:20:45.230577765 +0000 UTC m=+765.148663146" observedRunningTime="2025-11-25 12:20:45.89501362 +0000 UTC m=+765.813099041" watchObservedRunningTime="2025-11-25 12:20:45.898539312 +0000 UTC m=+765.816624723" Nov 25 12:20:46 crc kubenswrapper[4693]: I1125 12:20:46.882752 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:20:58 crc kubenswrapper[4693]: I1125 12:20:58.245485 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7b574576ff-z9ftm" Nov 25 12:21:05 crc kubenswrapper[4693]: I1125 12:21:05.114332 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:21:05 crc kubenswrapper[4693]: I1125 12:21:05.115078 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:21:17 crc kubenswrapper[4693]: I1125 12:21:17.709132 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.398134 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-tz5hq"] Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.400363 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.403321 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-p5jhb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.403469 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.404027 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.414673 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-54csl"] Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.415550 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.417507 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.423822 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-54csl"] Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.488329 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dnzwb"] Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.489152 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.493885 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.494104 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.494213 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.494499 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6qh5n" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.505874 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-m86lr"] Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.506733 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.508670 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.519900 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-m86lr"] Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547274 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wljd8\" (UniqueName: \"kubernetes.io/projected/43653fbc-4dc9-437e-a5f7-8cc774881d8a-kube-api-access-wljd8\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547329 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfxnj\" (UniqueName: \"kubernetes.io/projected/667fdb6a-4e0e-4b92-ae50-aa1880c69402-kube-api-access-tfxnj\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547360 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-sockets\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547403 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-reloader\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547493 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43653fbc-4dc9-437e-a5f7-8cc774881d8a-metrics-certs\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547536 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-conf\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547558 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b92f94fa-96a8-4257-890e-076b4292b487-cert\") pod \"frr-k8s-webhook-server-6998585d5-54csl\" (UID: \"b92f94fa-96a8-4257-890e-076b4292b487\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547586 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547605 
4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-startup\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547814 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/667fdb6a-4e0e-4b92-ae50-aa1880c69402-metallb-excludel2\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547901 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-metrics-certs\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547926 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-metrics\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.547956 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4jwh\" (UniqueName: \"kubernetes.io/projected/b92f94fa-96a8-4257-890e-076b4292b487-kube-api-access-z4jwh\") pod \"frr-k8s-webhook-server-6998585d5-54csl\" (UID: \"b92f94fa-96a8-4257-890e-076b4292b487\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.648991 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/667fdb6a-4e0e-4b92-ae50-aa1880c69402-metallb-excludel2\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649058 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knk6q\" (UniqueName: \"kubernetes.io/projected/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-kube-api-access-knk6q\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649076 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-metrics-certs\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649092 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-metrics\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649109 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z4jwh\" (UniqueName: \"kubernetes.io/projected/b92f94fa-96a8-4257-890e-076b4292b487-kube-api-access-z4jwh\") pod \"frr-k8s-webhook-server-6998585d5-54csl\" (UID: \"b92f94fa-96a8-4257-890e-076b4292b487\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649131 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wljd8\" (UniqueName: \"kubernetes.io/projected/43653fbc-4dc9-437e-a5f7-8cc774881d8a-kube-api-access-wljd8\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649153 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfxnj\" (UniqueName: \"kubernetes.io/projected/667fdb6a-4e0e-4b92-ae50-aa1880c69402-kube-api-access-tfxnj\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649176 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-sockets\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649306 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-reloader\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649338 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43653fbc-4dc9-437e-a5f7-8cc774881d8a-metrics-certs\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649575 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b92f94fa-96a8-4257-890e-076b4292b487-cert\") pod \"frr-k8s-webhook-server-6998585d5-54csl\" (UID: \"b92f94fa-96a8-4257-890e-076b4292b487\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649621 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-conf\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649641 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-metrics\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649644 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 
12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649692 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-sockets\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649712 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-startup\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649812 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-metrics-certs\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649855 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-cert\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649892 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/667fdb6a-4e0e-4b92-ae50-aa1880c69402-metallb-excludel2\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.649922 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-conf\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: E1125 12:21:18.650003 4693 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 12:21:18 crc kubenswrapper[4693]: E1125 12:21:18.650081 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist podName:667fdb6a-4e0e-4b92-ae50-aa1880c69402 nodeName:}" failed. No retries permitted until 2025-11-25 12:21:19.150067613 +0000 UTC m=+799.068152994 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist") pod "speaker-dnzwb" (UID: "667fdb6a-4e0e-4b92-ae50-aa1880c69402") : secret "metallb-memberlist" not found Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.650400 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/43653fbc-4dc9-437e-a5f7-8cc774881d8a-reloader\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.650685 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/43653fbc-4dc9-437e-a5f7-8cc774881d8a-frr-startup\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.655023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-metrics-certs\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.655155 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43653fbc-4dc9-437e-a5f7-8cc774881d8a-metrics-certs\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.656859 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b92f94fa-96a8-4257-890e-076b4292b487-cert\") pod \"frr-k8s-webhook-server-6998585d5-54csl\" (UID: \"b92f94fa-96a8-4257-890e-076b4292b487\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.668745 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wljd8\" (UniqueName: \"kubernetes.io/projected/43653fbc-4dc9-437e-a5f7-8cc774881d8a-kube-api-access-wljd8\") pod \"frr-k8s-tz5hq\" (UID: \"43653fbc-4dc9-437e-a5f7-8cc774881d8a\") " pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.670625 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfxnj\" (UniqueName: \"kubernetes.io/projected/667fdb6a-4e0e-4b92-ae50-aa1880c69402-kube-api-access-tfxnj\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.671937 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4jwh\" (UniqueName: \"kubernetes.io/projected/b92f94fa-96a8-4257-890e-076b4292b487-kube-api-access-z4jwh\") pod \"frr-k8s-webhook-server-6998585d5-54csl\" (UID: \"b92f94fa-96a8-4257-890e-076b4292b487\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.727161 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.739842 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.750897 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-metrics-certs\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.750944 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-cert\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.750989 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knk6q\" (UniqueName: \"kubernetes.io/projected/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-kube-api-access-knk6q\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.752843 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.755028 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-metrics-certs\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.767901 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knk6q\" (UniqueName: \"kubernetes.io/projected/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-kube-api-access-knk6q\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.767902 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cc79c1-2d72-47b8-abcb-14af4fb9afe7-cert\") pod \"controller-6c7b4b5f48-m86lr\" (UID: \"89cc79c1-2d72-47b8-abcb-14af4fb9afe7\") " pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:18 crc kubenswrapper[4693]: I1125 12:21:18.826826 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:19 crc kubenswrapper[4693]: I1125 12:21:19.035179 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-m86lr"] Nov 25 12:21:19 crc kubenswrapper[4693]: I1125 12:21:19.067272 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerStarted","Data":"0bed2bbf088aa398bddd0456b5f5112a5e5dd46ffee935d0e43be2a656c83475"} Nov 25 12:21:19 crc kubenswrapper[4693]: I1125 12:21:19.068349 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-m86lr" event={"ID":"89cc79c1-2d72-47b8-abcb-14af4fb9afe7","Type":"ContainerStarted","Data":"b9473b9181ce3a2a0cfc1ccda604aaedad274e05916c41a40e302b4635c5dedc"} Nov 25 12:21:19 crc kubenswrapper[4693]: I1125 12:21:19.135033 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-54csl"] Nov 25 12:21:19 crc kubenswrapper[4693]: W1125 12:21:19.139910 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92f94fa_96a8_4257_890e_076b4292b487.slice/crio-bac8ecbc848c38cd895785811d0ae9e62fa7ef86455ce9c82865eb983c2cd914 WatchSource:0}: Error finding container bac8ecbc848c38cd895785811d0ae9e62fa7ef86455ce9c82865eb983c2cd914: Status 404 returned error can't find the container with id bac8ecbc848c38cd895785811d0ae9e62fa7ef86455ce9c82865eb983c2cd914 Nov 25 12:21:19 crc kubenswrapper[4693]: I1125 12:21:19.157564 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:19 crc kubenswrapper[4693]: E1125 12:21:19.157783 4693 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 25 12:21:19 crc kubenswrapper[4693]: E1125 12:21:19.157852 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist podName:667fdb6a-4e0e-4b92-ae50-aa1880c69402 nodeName:}" failed. No retries permitted until 2025-11-25 12:21:20.157834267 +0000 UTC m=+800.075919648 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist") pod "speaker-dnzwb" (UID: "667fdb6a-4e0e-4b92-ae50-aa1880c69402") : secret "metallb-memberlist" not found Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.073553 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" event={"ID":"b92f94fa-96a8-4257-890e-076b4292b487","Type":"ContainerStarted","Data":"bac8ecbc848c38cd895785811d0ae9e62fa7ef86455ce9c82865eb983c2cd914"} Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.078286 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-m86lr" event={"ID":"89cc79c1-2d72-47b8-abcb-14af4fb9afe7","Type":"ContainerStarted","Data":"e64278bd0d2ecc1e824eca0bdb3b3c51d467b2a48bd48439290cffea5104dd27"} Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.078340 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-m86lr" event={"ID":"89cc79c1-2d72-47b8-abcb-14af4fb9afe7","Type":"ContainerStarted","Data":"06d2299152795c59ecaec553a77a05e7122cc5919f97675ce8365ddd0669943b"} Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.078702 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.100119 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-m86lr" podStartSLOduration=2.10009347 podStartE2EDuration="2.10009347s" podCreationTimestamp="2025-11-25 12:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:21:20.093548395 +0000 UTC m=+800.011633806" watchObservedRunningTime="2025-11-25 12:21:20.10009347 +0000 UTC m=+800.018178851" Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.170912 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.176981 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/667fdb6a-4e0e-4b92-ae50-aa1880c69402-memberlist\") pod \"speaker-dnzwb\" (UID: \"667fdb6a-4e0e-4b92-ae50-aa1880c69402\") " pod="metallb-system/speaker-dnzwb" Nov 25 12:21:20 crc kubenswrapper[4693]: I1125 12:21:20.303344 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dnzwb" Nov 25 12:21:20 crc kubenswrapper[4693]: W1125 12:21:20.322503 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod667fdb6a_4e0e_4b92_ae50_aa1880c69402.slice/crio-a9e3f16a774adcdcd5368f3b8e34f4329185d573f1eac35c5968e45b8df9db2d WatchSource:0}: Error finding container a9e3f16a774adcdcd5368f3b8e34f4329185d573f1eac35c5968e45b8df9db2d: Status 404 returned error can't find the container with id a9e3f16a774adcdcd5368f3b8e34f4329185d573f1eac35c5968e45b8df9db2d Nov 25 12:21:21 crc kubenswrapper[4693]: I1125 12:21:21.100546 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dnzwb" event={"ID":"667fdb6a-4e0e-4b92-ae50-aa1880c69402","Type":"ContainerStarted","Data":"990a317985657efbaf35dc8076358fa24bf43a019b9ab5ab0fed89f8396c1255"} Nov 25 12:21:21 crc kubenswrapper[4693]: I1125 12:21:21.100858 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dnzwb" event={"ID":"667fdb6a-4e0e-4b92-ae50-aa1880c69402","Type":"ContainerStarted","Data":"a9e3f16a774adcdcd5368f3b8e34f4329185d573f1eac35c5968e45b8df9db2d"} Nov 25 12:21:22 crc kubenswrapper[4693]: I1125 12:21:22.109597 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dnzwb" event={"ID":"667fdb6a-4e0e-4b92-ae50-aa1880c69402","Type":"ContainerStarted","Data":"f4781de79e05246054767fdd3fa08d99e2290af55b4f8c9ea82e7270a128f751"} Nov 25 12:21:22 crc kubenswrapper[4693]: I1125 12:21:22.109883 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dnzwb" Nov 25 12:21:22 crc kubenswrapper[4693]: I1125 12:21:22.134595 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dnzwb" podStartSLOduration=4.134571234 podStartE2EDuration="4.134571234s" podCreationTimestamp="2025-11-25 12:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:21:22.128504368 +0000 UTC m=+802.046589759" watchObservedRunningTime="2025-11-25 12:21:22.134571234 +0000 UTC m=+802.052656635" Nov 25 12:21:29 crc kubenswrapper[4693]: I1125 12:21:29.152546 4693 generic.go:334] "Generic (PLEG): container finished" podID="43653fbc-4dc9-437e-a5f7-8cc774881d8a" containerID="fa7b0a66279ee53479332ee0073119eeceb0ea5e78543e6e5120abb4b3642a8c" exitCode=0 Nov 25 12:21:29 crc kubenswrapper[4693]: I1125 12:21:29.152604 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerDied","Data":"fa7b0a66279ee53479332ee0073119eeceb0ea5e78543e6e5120abb4b3642a8c"} Nov 25 12:21:29 crc kubenswrapper[4693]: I1125 12:21:29.155589 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" event={"ID":"b92f94fa-96a8-4257-890e-076b4292b487","Type":"ContainerStarted","Data":"b4288de5cb43b8be350f4edefad3df0e94e0c09c9c55fdb8cd06ad20addda279"} Nov 25 12:21:29 crc kubenswrapper[4693]: I1125 12:21:29.155869 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:29 crc kubenswrapper[4693]: I1125 12:21:29.193563 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" podStartSLOduration=1.990629897 
podStartE2EDuration="11.193525774s" podCreationTimestamp="2025-11-25 12:21:18 +0000 UTC" firstStartedPulling="2025-11-25 12:21:19.141532394 +0000 UTC m=+799.059617775" lastFinishedPulling="2025-11-25 12:21:28.344428261 +0000 UTC m=+808.262513652" observedRunningTime="2025-11-25 12:21:29.18977067 +0000 UTC m=+809.107856051" watchObservedRunningTime="2025-11-25 12:21:29.193525774 +0000 UTC m=+809.111611155" Nov 25 12:21:30 crc kubenswrapper[4693]: I1125 12:21:30.162548 4693 generic.go:334] "Generic (PLEG): container finished" podID="43653fbc-4dc9-437e-a5f7-8cc774881d8a" containerID="b9636b685250cea5a381d844580230d73ded53ddb080dbc9448ea87a9ccc2bd1" exitCode=0 Nov 25 12:21:30 crc kubenswrapper[4693]: I1125 12:21:30.162613 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerDied","Data":"b9636b685250cea5a381d844580230d73ded53ddb080dbc9448ea87a9ccc2bd1"} Nov 25 12:21:30 crc kubenswrapper[4693]: I1125 12:21:30.311417 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dnzwb" Nov 25 12:21:31 crc kubenswrapper[4693]: I1125 12:21:31.169557 4693 generic.go:334] "Generic (PLEG): container finished" podID="43653fbc-4dc9-437e-a5f7-8cc774881d8a" containerID="15fedd8b9d6032378ea0b677db29182074d528f6760768a3577ab5d42bae57aa" exitCode=0 Nov 25 12:21:31 crc kubenswrapper[4693]: I1125 12:21:31.169593 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerDied","Data":"15fedd8b9d6032378ea0b677db29182074d528f6760768a3577ab5d42bae57aa"} Nov 25 12:21:32 crc kubenswrapper[4693]: I1125 12:21:32.180467 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerStarted","Data":"6793250130d5bc9f73392985b5d5a66dd103b02a97622384cbda0e5a17d9ca4b"} Nov 25 12:21:32 crc kubenswrapper[4693]: I1125 12:21:32.180573 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerStarted","Data":"9eaec31910ab8f9a7a88e1ad420d40c4212b4be6e6bc180b93191ae16be1a5d7"} Nov 25 12:21:32 crc kubenswrapper[4693]: I1125 12:21:32.180589 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerStarted","Data":"e83f32b7584febc6c51cafc637cf1c5e02641deb6c2aa3ac3079db2460b86890"} Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.194324 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerStarted","Data":"afcf6ca538902a0ad05a2d332d0887dcaa7de3092331d6e3309f4fa8a30bc300"} Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.194707 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerStarted","Data":"480907313f57503355c28cffc70a86dc41031332ab9a89aad908101c9c207f11"} Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.401935 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2dtgk"] Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.402829 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2dtgk" Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.405329 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.405548 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.405747 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kjtkc" Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.412903 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2dtgk"] Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.588167 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5hw\" (UniqueName: \"kubernetes.io/projected/1df10279-b5a2-48d6-b6d0-82eb83cf0485-kube-api-access-cd5hw\") pod \"openstack-operator-index-2dtgk\" (UID: \"1df10279-b5a2-48d6-b6d0-82eb83cf0485\") " pod="openstack-operators/openstack-operator-index-2dtgk" Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.689992 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5hw\" (UniqueName: \"kubernetes.io/projected/1df10279-b5a2-48d6-b6d0-82eb83cf0485-kube-api-access-cd5hw\") pod \"openstack-operator-index-2dtgk\" (UID: \"1df10279-b5a2-48d6-b6d0-82eb83cf0485\") " pod="openstack-operators/openstack-operator-index-2dtgk" Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.708956 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5hw\" (UniqueName: \"kubernetes.io/projected/1df10279-b5a2-48d6-b6d0-82eb83cf0485-kube-api-access-cd5hw\") pod \"openstack-operator-index-2dtgk\" (UID: \"1df10279-b5a2-48d6-b6d0-82eb83cf0485\") " pod="openstack-operators/openstack-operator-index-2dtgk" Nov 25 12:21:33 crc kubenswrapper[4693]: I1125 12:21:33.721057 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2dtgk" Nov 25 12:21:34 crc kubenswrapper[4693]: I1125 12:21:34.204960 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-tz5hq" event={"ID":"43653fbc-4dc9-437e-a5f7-8cc774881d8a","Type":"ContainerStarted","Data":"18fd711ba2e6e769d279e9b2fcb4d5db02b3e96b6ffe59a8bf6be20e1625e0fa"} Nov 25 12:21:34 crc kubenswrapper[4693]: I1125 12:21:34.205340 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:34 crc kubenswrapper[4693]: I1125 12:21:34.215204 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2dtgk"] Nov 25 12:21:34 crc kubenswrapper[4693]: W1125 12:21:34.225860 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df10279_b5a2_48d6_b6d0_82eb83cf0485.slice/crio-e59a9e3e4f513501c7f1ec0b132fa80f5599fce869ff9f0d150cd1d1304889a3 WatchSource:0}: Error finding container e59a9e3e4f513501c7f1ec0b132fa80f5599fce869ff9f0d150cd1d1304889a3: Status 404 returned error can't find the container with id e59a9e3e4f513501c7f1ec0b132fa80f5599fce869ff9f0d150cd1d1304889a3 Nov 25 12:21:34 crc kubenswrapper[4693]: I1125 12:21:34.248283 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-tz5hq" podStartSLOduration=6.908028403 podStartE2EDuration="16.248263625s" podCreationTimestamp="2025-11-25 12:21:18 +0000 UTC" firstStartedPulling="2025-11-25 12:21:18.982420892 +0000 UTC m=+798.900506273" lastFinishedPulling="2025-11-25 12:21:28.322656114 +0000 UTC m=+808.240741495" observedRunningTime="2025-11-25 12:21:34.244344358 +0000 UTC m=+814.162429759" watchObservedRunningTime="2025-11-25 12:21:34.248263625 +0000 UTC m=+814.166349006" Nov 25 12:21:35 crc kubenswrapper[4693]: I1125 12:21:35.113718 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:21:35 crc kubenswrapper[4693]: I1125 12:21:35.114017 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:21:35 crc kubenswrapper[4693]: I1125 12:21:35.114059 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:21:35 crc kubenswrapper[4693]: I1125 12:21:35.114648 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c48b1bd5f615c180301fa268ce0ea0e2b9ab9ea9e6d73443257071ddeda6d194"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:21:35 crc kubenswrapper[4693]: I1125 12:21:35.114701 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" 
containerName="machine-config-daemon" containerID="cri-o://c48b1bd5f615c180301fa268ce0ea0e2b9ab9ea9e6d73443257071ddeda6d194" gracePeriod=600 Nov 25 12:21:35 crc kubenswrapper[4693]: I1125 12:21:35.212013 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2dtgk" event={"ID":"1df10279-b5a2-48d6-b6d0-82eb83cf0485","Type":"ContainerStarted","Data":"e59a9e3e4f513501c7f1ec0b132fa80f5599fce869ff9f0d150cd1d1304889a3"} Nov 25 12:21:36 crc kubenswrapper[4693]: I1125 12:21:36.220529 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="c48b1bd5f615c180301fa268ce0ea0e2b9ab9ea9e6d73443257071ddeda6d194" exitCode=0 Nov 25 12:21:36 crc kubenswrapper[4693]: I1125 12:21:36.220567 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"c48b1bd5f615c180301fa268ce0ea0e2b9ab9ea9e6d73443257071ddeda6d194"} Nov 25 12:21:36 crc kubenswrapper[4693]: I1125 12:21:36.220639 4693 scope.go:117] "RemoveContainer" containerID="c8db9f943783b89f4f1f5b7dfdca47ee2c64b3dd3be4d4df26e91fae510d1733" Nov 25 12:21:36 crc kubenswrapper[4693]: I1125 12:21:36.783519 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2dtgk"] Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.231770 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"f1602027df59cd76a649d636d394ab648e039f6efe47c91bfe119cadecb3b352"} Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.586634 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7fm7p"] Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.587707 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.594206 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7fm7p"] Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.739542 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htd9\" (UniqueName: \"kubernetes.io/projected/b14001de-fa88-4632-87e8-e5a4d703e633-kube-api-access-9htd9\") pod \"openstack-operator-index-7fm7p\" (UID: \"b14001de-fa88-4632-87e8-e5a4d703e633\") " pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.840841 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9htd9\" (UniqueName: \"kubernetes.io/projected/b14001de-fa88-4632-87e8-e5a4d703e633-kube-api-access-9htd9\") pod \"openstack-operator-index-7fm7p\" (UID: \"b14001de-fa88-4632-87e8-e5a4d703e633\") " pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.862252 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htd9\" (UniqueName: \"kubernetes.io/projected/b14001de-fa88-4632-87e8-e5a4d703e633-kube-api-access-9htd9\") pod \"openstack-operator-index-7fm7p\" (UID: \"b14001de-fa88-4632-87e8-e5a4d703e633\") " pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:37 crc kubenswrapper[4693]: I1125 12:21:37.919094 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:38 crc kubenswrapper[4693]: I1125 12:21:38.329790 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7fm7p"] Nov 25 12:21:38 crc kubenswrapper[4693]: I1125 12:21:38.734893 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:38 crc kubenswrapper[4693]: I1125 12:21:38.750552 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-54csl" Nov 25 12:21:38 crc kubenswrapper[4693]: I1125 12:21:38.785157 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:38 crc kubenswrapper[4693]: I1125 12:21:38.835467 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-m86lr" Nov 25 12:21:39 crc kubenswrapper[4693]: I1125 12:21:39.249406 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7fm7p" event={"ID":"b14001de-fa88-4632-87e8-e5a4d703e633","Type":"ContainerStarted","Data":"b0b81759dbae21f2e3023e110d3257567bcefccfc22970fa0d578a6d7f8ece77"} Nov 25 12:21:40 crc kubenswrapper[4693]: I1125 12:21:40.795108 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xlrpx"] Nov 25 12:21:40 crc kubenswrapper[4693]: I1125 12:21:40.796533 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:40 crc kubenswrapper[4693]: I1125 12:21:40.809158 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xlrpx"] Nov 25 12:21:40 crc kubenswrapper[4693]: I1125 12:21:40.988898 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-catalog-content\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:40 crc kubenswrapper[4693]: I1125 12:21:40.988952 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lwc\" (UniqueName: \"kubernetes.io/projected/719aedbe-c024-4eae-b9ff-1104e3a16a11-kube-api-access-r9lwc\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:40 crc kubenswrapper[4693]: I1125 12:21:40.989016 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-utilities\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.089930 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-catalog-content\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.090189 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lwc\" (UniqueName: \"kubernetes.io/projected/719aedbe-c024-4eae-b9ff-1104e3a16a11-kube-api-access-r9lwc\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.090312 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-utilities\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.090718 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-utilities\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.090871 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-catalog-content\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.125868 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r9lwc\" (UniqueName: \"kubernetes.io/projected/719aedbe-c024-4eae-b9ff-1104e3a16a11-kube-api-access-r9lwc\") pod \"certified-operators-xlrpx\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.415615 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:21:41 crc kubenswrapper[4693]: I1125 12:21:41.859360 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xlrpx"] Nov 25 12:21:41 crc kubenswrapper[4693]: W1125 12:21:41.868014 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod719aedbe_c024_4eae_b9ff_1104e3a16a11.slice/crio-67fb027f9d64c56bc14ebd30eb40a88de9deaf60bda6300e01ecf58919a1113a WatchSource:0}: Error finding container 67fb027f9d64c56bc14ebd30eb40a88de9deaf60bda6300e01ecf58919a1113a: Status 404 returned error can't find the container with id 67fb027f9d64c56bc14ebd30eb40a88de9deaf60bda6300e01ecf58919a1113a Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.265468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7fm7p" event={"ID":"b14001de-fa88-4632-87e8-e5a4d703e633","Type":"ContainerStarted","Data":"bfc65c07f895b57625f95ff73296161982e2b96301b486719b5d60556f0a53e5"} Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.267641 4693 generic.go:334] "Generic (PLEG): container finished" podID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerID="f962c1702b342c3a656b841a98bbce491382b7d9724dfc64167832df28385ae9" exitCode=0 Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.267707 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlrpx" event={"ID":"719aedbe-c024-4eae-b9ff-1104e3a16a11","Type":"ContainerDied","Data":"f962c1702b342c3a656b841a98bbce491382b7d9724dfc64167832df28385ae9"} Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.267738 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlrpx" event={"ID":"719aedbe-c024-4eae-b9ff-1104e3a16a11","Type":"ContainerStarted","Data":"67fb027f9d64c56bc14ebd30eb40a88de9deaf60bda6300e01ecf58919a1113a"} Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.269636 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2dtgk" event={"ID":"1df10279-b5a2-48d6-b6d0-82eb83cf0485","Type":"ContainerStarted","Data":"3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5"} Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.269733 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-2dtgk" podUID="1df10279-b5a2-48d6-b6d0-82eb83cf0485" containerName="registry-server" containerID="cri-o://3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5" gracePeriod=2 Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.292179 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7fm7p" podStartSLOduration=2.754445144 podStartE2EDuration="5.292156962s" podCreationTimestamp="2025-11-25 12:21:37 +0000 UTC" firstStartedPulling="2025-11-25 12:21:38.737003553 +0000 UTC m=+818.655088934" 
lastFinishedPulling="2025-11-25 12:21:41.274715371 +0000 UTC m=+821.192800752" observedRunningTime="2025-11-25 12:21:42.282287422 +0000 UTC m=+822.200372803" watchObservedRunningTime="2025-11-25 12:21:42.292156962 +0000 UTC m=+822.210242353" Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.316728 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2dtgk" podStartSLOduration=2.2702794490000002 podStartE2EDuration="9.31670651s" podCreationTimestamp="2025-11-25 12:21:33 +0000 UTC" firstStartedPulling="2025-11-25 12:21:34.228252639 +0000 UTC m=+814.146338020" lastFinishedPulling="2025-11-25 12:21:41.2746797 +0000 UTC m=+821.192765081" observedRunningTime="2025-11-25 12:21:42.316054265 +0000 UTC m=+822.234139646" watchObservedRunningTime="2025-11-25 12:21:42.31670651 +0000 UTC m=+822.234791911" Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.725692 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2dtgk" Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.912076 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd5hw\" (UniqueName: \"kubernetes.io/projected/1df10279-b5a2-48d6-b6d0-82eb83cf0485-kube-api-access-cd5hw\") pod \"1df10279-b5a2-48d6-b6d0-82eb83cf0485\" (UID: \"1df10279-b5a2-48d6-b6d0-82eb83cf0485\") " Nov 25 12:21:42 crc kubenswrapper[4693]: I1125 12:21:42.917706 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df10279-b5a2-48d6-b6d0-82eb83cf0485-kube-api-access-cd5hw" (OuterVolumeSpecName: "kube-api-access-cd5hw") pod "1df10279-b5a2-48d6-b6d0-82eb83cf0485" (UID: "1df10279-b5a2-48d6-b6d0-82eb83cf0485"). InnerVolumeSpecName "kube-api-access-cd5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.013643 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd5hw\" (UniqueName: \"kubernetes.io/projected/1df10279-b5a2-48d6-b6d0-82eb83cf0485-kube-api-access-cd5hw\") on node \"crc\" DevicePath \"\"" Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.278304 4693 generic.go:334] "Generic (PLEG): container finished" podID="1df10279-b5a2-48d6-b6d0-82eb83cf0485" containerID="3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5" exitCode=0 Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.278448 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2dtgk" Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.278457 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2dtgk" event={"ID":"1df10279-b5a2-48d6-b6d0-82eb83cf0485","Type":"ContainerDied","Data":"3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5"} Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.278515 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2dtgk" event={"ID":"1df10279-b5a2-48d6-b6d0-82eb83cf0485","Type":"ContainerDied","Data":"e59a9e3e4f513501c7f1ec0b132fa80f5599fce869ff9f0d150cd1d1304889a3"} Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.278540 4693 scope.go:117] "RemoveContainer" containerID="3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5" Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.308324 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2dtgk"] Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.313999 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-2dtgk"] Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.334130 4693 scope.go:117] "RemoveContainer" containerID="3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5" Nov 25 12:21:43 crc kubenswrapper[4693]: E1125 12:21:43.334515 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5\": container with ID starting with 3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5 not found: ID does not exist" containerID="3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5" Nov 25 12:21:43 crc kubenswrapper[4693]: I1125 12:21:43.334550 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5"} err="failed to get container status \"3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5\": rpc error: code = NotFound desc = could not find container \"3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5\": container with ID starting with 3ed7963a7879d4aa56f65c3074ac93344b46e54a3638e4930cfe43fa0e4763c5 not found: ID does not exist" Nov 25 12:21:44 crc kubenswrapper[4693]: I1125 12:21:44.288623 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlrpx" event={"ID":"719aedbe-c024-4eae-b9ff-1104e3a16a11","Type":"ContainerStarted","Data":"1b6812d8a30631a433062381a771d8ee30b21514cc9b466719cc6b4d97c60261"} Nov 25 12:21:44 crc kubenswrapper[4693]: I1125 12:21:44.823735 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df10279-b5a2-48d6-b6d0-82eb83cf0485" path="/var/lib/kubelet/pods/1df10279-b5a2-48d6-b6d0-82eb83cf0485/volumes" Nov 25 12:21:45 crc kubenswrapper[4693]: I1125 12:21:45.319758 4693 generic.go:334] "Generic (PLEG): container finished" podID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerID="1b6812d8a30631a433062381a771d8ee30b21514cc9b466719cc6b4d97c60261" exitCode=0 Nov 25 12:21:45 crc kubenswrapper[4693]: I1125 12:21:45.319812 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlrpx" 
event={"ID":"719aedbe-c024-4eae-b9ff-1104e3a16a11","Type":"ContainerDied","Data":"1b6812d8a30631a433062381a771d8ee30b21514cc9b466719cc6b4d97c60261"} Nov 25 12:21:47 crc kubenswrapper[4693]: I1125 12:21:47.919591 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:47 crc kubenswrapper[4693]: I1125 12:21:47.920150 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:47 crc kubenswrapper[4693]: I1125 12:21:47.967460 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.002956 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6mt8k"] Nov 25 12:21:48 crc kubenswrapper[4693]: E1125 12:21:48.003218 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df10279-b5a2-48d6-b6d0-82eb83cf0485" containerName="registry-server" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.003230 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df10279-b5a2-48d6-b6d0-82eb83cf0485" containerName="registry-server" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.003349 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df10279-b5a2-48d6-b6d0-82eb83cf0485" containerName="registry-server" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.004100 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.012266 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mt8k"] Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.187256 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-catalog-content\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.187333 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-utilities\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.187426 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jl7f\" (UniqueName: \"kubernetes.io/projected/7be0eecb-730c-4048-b1ef-f35140cf3b67-kube-api-access-4jl7f\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.288423 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-utilities\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.288502 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jl7f\" (UniqueName: \"kubernetes.io/projected/7be0eecb-730c-4048-b1ef-f35140cf3b67-kube-api-access-4jl7f\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.288527 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-catalog-content\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.289027 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-catalog-content\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.289322 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-utilities\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.309037 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jl7f\" (UniqueName: \"kubernetes.io/projected/7be0eecb-730c-4048-b1ef-f35140cf3b67-kube-api-access-4jl7f\") pod \"redhat-marketplace-6mt8k\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.321170 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.372571 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7fm7p" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.731915 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-tz5hq" Nov 25 12:21:48 crc kubenswrapper[4693]: I1125 12:21:48.752701 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mt8k"] Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.355335 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mt8k" event={"ID":"7be0eecb-730c-4048-b1ef-f35140cf3b67","Type":"ContainerStarted","Data":"6c10df8fa14795835093e99031ede513ad7fb5f1d7e76fdaaec7093e077398c2"} Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.830346 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk"] Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.832091 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.837004 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gb9t6" Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.841414 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk"] Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.920612 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-util\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.920696 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-bundle\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:49 crc kubenswrapper[4693]: I1125 12:21:49.920752 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc85c\" (UniqueName: \"kubernetes.io/projected/7c2f26eb-e680-4a45-8e01-bf653f711b07-kube-api-access-hc85c\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.021898 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-util\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.021989 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-bundle\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.022043 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc85c\" (UniqueName: \"kubernetes.io/projected/7c2f26eb-e680-4a45-8e01-bf653f711b07-kube-api-access-hc85c\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.022310 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-util\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.022671 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-bundle\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.046700 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc85c\" (UniqueName: \"kubernetes.io/projected/7c2f26eb-e680-4a45-8e01-bf653f711b07-kube-api-access-hc85c\") pod \"bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.153954 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:21:50 crc kubenswrapper[4693]: I1125 12:21:50.546481 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk"] Nov 25 12:21:50 crc kubenswrapper[4693]: W1125 12:21:50.563797 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c2f26eb_e680_4a45_8e01_bf653f711b07.slice/crio-e30e6674d06d9e308e923501dd605247a6fc1a805487e9b22821e556ac5e9a66 WatchSource:0}: Error finding container e30e6674d06d9e308e923501dd605247a6fc1a805487e9b22821e556ac5e9a66: Status 404 returned error can't find the container with id e30e6674d06d9e308e923501dd605247a6fc1a805487e9b22821e556ac5e9a66 Nov 25 12:21:51 crc kubenswrapper[4693]: I1125 12:21:51.369491 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" event={"ID":"7c2f26eb-e680-4a45-8e01-bf653f711b07","Type":"ContainerStarted","Data":"e30e6674d06d9e308e923501dd605247a6fc1a805487e9b22821e556ac5e9a66"} Nov 25 12:21:51 crc kubenswrapper[4693]: I1125 12:21:51.371842 4693 generic.go:334] "Generic (PLEG): container finished" podID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerID="174a2f002b034e3c990e51bd46e92eadf960dd8c4a50f9c8382fbb587ab4c5c8" exitCode=0 Nov 25 12:21:51 crc kubenswrapper[4693]: I1125 12:21:51.371900 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mt8k" event={"ID":"7be0eecb-730c-4048-b1ef-f35140cf3b67","Type":"ContainerDied","Data":"174a2f002b034e3c990e51bd46e92eadf960dd8c4a50f9c8382fbb587ab4c5c8"} Nov 25 12:21:52 crc kubenswrapper[4693]: I1125 12:21:52.388554 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" event={"ID":"7c2f26eb-e680-4a45-8e01-bf653f711b07","Type":"ContainerStarted","Data":"a974dc6da4d50f07c71556b0dc0a12001aa819cd0d141036bdc9d833a0f15159"} Nov 25 12:21:53 crc kubenswrapper[4693]: I1125 12:21:53.388684 4693 
generic.go:334] "Generic (PLEG): container finished" podID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerID="a974dc6da4d50f07c71556b0dc0a12001aa819cd0d141036bdc9d833a0f15159" exitCode=0 Nov 25 12:21:53 crc kubenswrapper[4693]: I1125 12:21:53.388802 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" event={"ID":"7c2f26eb-e680-4a45-8e01-bf653f711b07","Type":"ContainerDied","Data":"a974dc6da4d50f07c71556b0dc0a12001aa819cd0d141036bdc9d833a0f15159"} Nov 25 12:21:56 crc kubenswrapper[4693]: I1125 12:21:56.412566 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlrpx" event={"ID":"719aedbe-c024-4eae-b9ff-1104e3a16a11","Type":"ContainerStarted","Data":"2c45a1def313667b32d8f71b323f30327bca01b05730975c4ece847a45635bb7"} Nov 25 12:21:56 crc kubenswrapper[4693]: I1125 12:21:56.415023 4693 generic.go:334] "Generic (PLEG): container finished" podID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerID="605fa8f704a5a80b640d2d01e2350f17a66f4616a1164f017a548872b8539048" exitCode=0 Nov 25 12:21:56 crc kubenswrapper[4693]: I1125 12:21:56.415070 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mt8k" event={"ID":"7be0eecb-730c-4048-b1ef-f35140cf3b67","Type":"ContainerDied","Data":"605fa8f704a5a80b640d2d01e2350f17a66f4616a1164f017a548872b8539048"} Nov 25 12:21:56 crc kubenswrapper[4693]: I1125 12:21:56.439900 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xlrpx" podStartSLOduration=3.7283288409999997 podStartE2EDuration="16.439876821s" podCreationTimestamp="2025-11-25 12:21:40 +0000 UTC" firstStartedPulling="2025-11-25 12:21:42.269243741 +0000 UTC m=+822.187329122" lastFinishedPulling="2025-11-25 12:21:54.980791681 +0000 UTC m=+834.898877102" observedRunningTime="2025-11-25 12:21:56.435527004 +0000 UTC m=+836.353612415" watchObservedRunningTime="2025-11-25 12:21:56.439876821 +0000 UTC m=+836.357962222" Nov 25 12:21:59 crc kubenswrapper[4693]: I1125 12:21:59.440498 4693 generic.go:334] "Generic (PLEG): container finished" podID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerID="14ec5accdbac1a3038332147e73cef624b6803dc699603823d2d928d4f16af7a" exitCode=0 Nov 25 12:21:59 crc kubenswrapper[4693]: I1125 12:21:59.440578 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" event={"ID":"7c2f26eb-e680-4a45-8e01-bf653f711b07","Type":"ContainerDied","Data":"14ec5accdbac1a3038332147e73cef624b6803dc699603823d2d928d4f16af7a"} Nov 25 12:22:01 crc kubenswrapper[4693]: I1125 12:22:01.415940 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:22:01 crc kubenswrapper[4693]: I1125 12:22:01.419279 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:22:01 crc kubenswrapper[4693]: I1125 12:22:01.475499 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:22:02 crc kubenswrapper[4693]: I1125 12:22:02.503805 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:22:03 crc kubenswrapper[4693]: I1125 12:22:03.474351 4693 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mt8k" event={"ID":"7be0eecb-730c-4048-b1ef-f35140cf3b67","Type":"ContainerStarted","Data":"f56e7628ec900345f7e04747b3aa0ca5c1a58cd987a153baae00cde11c740d5f"} Nov 25 12:22:03 crc kubenswrapper[4693]: I1125 12:22:03.476122 4693 generic.go:334] "Generic (PLEG): container finished" podID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerID="52e5e951020ca85986be8e9ec49fcce86cba2c64aeacc6c7570389b24da6ad0b" exitCode=0 Nov 25 12:22:03 crc kubenswrapper[4693]: I1125 12:22:03.476177 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" event={"ID":"7c2f26eb-e680-4a45-8e01-bf653f711b07","Type":"ContainerDied","Data":"52e5e951020ca85986be8e9ec49fcce86cba2c64aeacc6c7570389b24da6ad0b"} Nov 25 12:22:03 crc kubenswrapper[4693]: I1125 12:22:03.491602 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6mt8k" podStartSLOduration=7.740898428 podStartE2EDuration="16.491587211s" podCreationTimestamp="2025-11-25 12:21:47 +0000 UTC" firstStartedPulling="2025-11-25 12:21:52.384753343 +0000 UTC m=+832.302838724" lastFinishedPulling="2025-11-25 12:22:01.135442116 +0000 UTC m=+841.053527507" observedRunningTime="2025-11-25 12:22:03.491428067 +0000 UTC m=+843.409513448" watchObservedRunningTime="2025-11-25 12:22:03.491587211 +0000 UTC m=+843.409672592" Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.185254 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xlrpx"] Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.483708 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xlrpx" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="registry-server" containerID="cri-o://2c45a1def313667b32d8f71b323f30327bca01b05730975c4ece847a45635bb7" gracePeriod=2 Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.761520 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.811816 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-bundle\") pod \"7c2f26eb-e680-4a45-8e01-bf653f711b07\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.811907 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-util\") pod \"7c2f26eb-e680-4a45-8e01-bf653f711b07\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.811933 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc85c\" (UniqueName: \"kubernetes.io/projected/7c2f26eb-e680-4a45-8e01-bf653f711b07-kube-api-access-hc85c\") pod \"7c2f26eb-e680-4a45-8e01-bf653f711b07\" (UID: \"7c2f26eb-e680-4a45-8e01-bf653f711b07\") " Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.813087 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-bundle" (OuterVolumeSpecName: "bundle") pod "7c2f26eb-e680-4a45-8e01-bf653f711b07" (UID: "7c2f26eb-e680-4a45-8e01-bf653f711b07"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.822684 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c2f26eb-e680-4a45-8e01-bf653f711b07-kube-api-access-hc85c" (OuterVolumeSpecName: "kube-api-access-hc85c") pod "7c2f26eb-e680-4a45-8e01-bf653f711b07" (UID: "7c2f26eb-e680-4a45-8e01-bf653f711b07"). InnerVolumeSpecName "kube-api-access-hc85c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.824365 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-util" (OuterVolumeSpecName: "util") pod "7c2f26eb-e680-4a45-8e01-bf653f711b07" (UID: "7c2f26eb-e680-4a45-8e01-bf653f711b07"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.914076 4693 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.914104 4693 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c2f26eb-e680-4a45-8e01-bf653f711b07-util\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:04 crc kubenswrapper[4693]: I1125 12:22:04.914113 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc85c\" (UniqueName: \"kubernetes.io/projected/7c2f26eb-e680-4a45-8e01-bf653f711b07-kube-api-access-hc85c\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.494214 4693 generic.go:334] "Generic (PLEG): container finished" podID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerID="2c45a1def313667b32d8f71b323f30327bca01b05730975c4ece847a45635bb7" exitCode=0 Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.494285 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlrpx" event={"ID":"719aedbe-c024-4eae-b9ff-1104e3a16a11","Type":"ContainerDied","Data":"2c45a1def313667b32d8f71b323f30327bca01b05730975c4ece847a45635bb7"} Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.497506 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" event={"ID":"7c2f26eb-e680-4a45-8e01-bf653f711b07","Type":"ContainerDied","Data":"e30e6674d06d9e308e923501dd605247a6fc1a805487e9b22821e556ac5e9a66"} Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.497544 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e30e6674d06d9e308e923501dd605247a6fc1a805487e9b22821e556ac5e9a66" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.497623 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.732172 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.823772 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9lwc\" (UniqueName: \"kubernetes.io/projected/719aedbe-c024-4eae-b9ff-1104e3a16a11-kube-api-access-r9lwc\") pod \"719aedbe-c024-4eae-b9ff-1104e3a16a11\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.823832 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-catalog-content\") pod \"719aedbe-c024-4eae-b9ff-1104e3a16a11\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.823893 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-utilities\") pod \"719aedbe-c024-4eae-b9ff-1104e3a16a11\" (UID: \"719aedbe-c024-4eae-b9ff-1104e3a16a11\") " Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.824868 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-utilities" (OuterVolumeSpecName: "utilities") pod "719aedbe-c024-4eae-b9ff-1104e3a16a11" (UID: "719aedbe-c024-4eae-b9ff-1104e3a16a11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.828630 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719aedbe-c024-4eae-b9ff-1104e3a16a11-kube-api-access-r9lwc" (OuterVolumeSpecName: "kube-api-access-r9lwc") pod "719aedbe-c024-4eae-b9ff-1104e3a16a11" (UID: "719aedbe-c024-4eae-b9ff-1104e3a16a11"). InnerVolumeSpecName "kube-api-access-r9lwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.862468 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "719aedbe-c024-4eae-b9ff-1104e3a16a11" (UID: "719aedbe-c024-4eae-b9ff-1104e3a16a11"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.925518 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9lwc\" (UniqueName: \"kubernetes.io/projected/719aedbe-c024-4eae-b9ff-1104e3a16a11-kube-api-access-r9lwc\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.925573 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:05 crc kubenswrapper[4693]: I1125 12:22:05.925592 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/719aedbe-c024-4eae-b9ff-1104e3a16a11-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.507302 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlrpx" event={"ID":"719aedbe-c024-4eae-b9ff-1104e3a16a11","Type":"ContainerDied","Data":"67fb027f9d64c56bc14ebd30eb40a88de9deaf60bda6300e01ecf58919a1113a"} Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.507357 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xlrpx" Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.508117 4693 scope.go:117] "RemoveContainer" containerID="2c45a1def313667b32d8f71b323f30327bca01b05730975c4ece847a45635bb7" Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.536917 4693 scope.go:117] "RemoveContainer" containerID="1b6812d8a30631a433062381a771d8ee30b21514cc9b466719cc6b4d97c60261" Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.543367 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xlrpx"] Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.546727 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xlrpx"] Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.575124 4693 scope.go:117] "RemoveContainer" containerID="f962c1702b342c3a656b841a98bbce491382b7d9724dfc64167832df28385ae9" Nov 25 12:22:06 crc kubenswrapper[4693]: I1125 12:22:06.821102 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" path="/var/lib/kubelet/pods/719aedbe-c024-4eae-b9ff-1104e3a16a11/volumes" Nov 25 12:22:08 crc kubenswrapper[4693]: I1125 12:22:08.321688 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:22:08 crc kubenswrapper[4693]: I1125 12:22:08.321731 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:22:08 crc kubenswrapper[4693]: I1125 12:22:08.356972 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:22:08 crc kubenswrapper[4693]: I1125 12:22:08.580160 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.183763 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mt8k"] Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.184338 4693 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-6mt8k" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="registry-server" containerID="cri-o://f56e7628ec900345f7e04747b3aa0ca5c1a58cd987a153baae00cde11c740d5f" gracePeriod=2 Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.540322 4693 generic.go:334] "Generic (PLEG): container finished" podID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerID="f56e7628ec900345f7e04747b3aa0ca5c1a58cd987a153baae00cde11c740d5f" exitCode=0 Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.540368 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mt8k" event={"ID":"7be0eecb-730c-4048-b1ef-f35140cf3b67","Type":"ContainerDied","Data":"f56e7628ec900345f7e04747b3aa0ca5c1a58cd987a153baae00cde11c740d5f"} Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655316 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d"] Nov 25 12:22:11 crc kubenswrapper[4693]: E1125 12:22:11.655613 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="extract-content" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655631 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="extract-content" Nov 25 12:22:11 crc kubenswrapper[4693]: E1125 12:22:11.655658 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="registry-server" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655667 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="registry-server" Nov 25 12:22:11 crc kubenswrapper[4693]: E1125 12:22:11.655680 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="extract-utilities" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655687 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="extract-utilities" Nov 25 12:22:11 crc kubenswrapper[4693]: E1125 12:22:11.655698 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerName="pull" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655704 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerName="pull" Nov 25 12:22:11 crc kubenswrapper[4693]: E1125 12:22:11.655713 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerName="extract" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655720 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerName="extract" Nov 25 12:22:11 crc kubenswrapper[4693]: E1125 12:22:11.655730 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerName="util" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655736 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2f26eb-e680-4a45-8e01-bf653f711b07" containerName="util" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655844 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c2f26eb-e680-4a45-8e01-bf653f711b07" 
containerName="extract" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.655856 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="719aedbe-c024-4eae-b9ff-1104e3a16a11" containerName="registry-server" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.656263 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.661557 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mmnnt" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.686507 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d"] Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.842812 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srzpf\" (UniqueName: \"kubernetes.io/projected/ebf85cb6-2651-4b5f-9cbe-973db55e14c5-kube-api-access-srzpf\") pod \"openstack-operator-controller-operator-7b567956b5-gk28d\" (UID: \"ebf85cb6-2651-4b5f-9cbe-973db55e14c5\") " pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.944934 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srzpf\" (UniqueName: \"kubernetes.io/projected/ebf85cb6-2651-4b5f-9cbe-973db55e14c5-kube-api-access-srzpf\") pod \"openstack-operator-controller-operator-7b567956b5-gk28d\" (UID: \"ebf85cb6-2651-4b5f-9cbe-973db55e14c5\") " pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.969092 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srzpf\" (UniqueName: \"kubernetes.io/projected/ebf85cb6-2651-4b5f-9cbe-973db55e14c5-kube-api-access-srzpf\") pod \"openstack-operator-controller-operator-7b567956b5-gk28d\" (UID: \"ebf85cb6-2651-4b5f-9cbe-973db55e14c5\") " pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:22:11 crc kubenswrapper[4693]: I1125 12:22:11.979308 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.184467 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.350199 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-utilities\") pod \"7be0eecb-730c-4048-b1ef-f35140cf3b67\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.350290 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-catalog-content\") pod \"7be0eecb-730c-4048-b1ef-f35140cf3b67\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.350360 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jl7f\" (UniqueName: \"kubernetes.io/projected/7be0eecb-730c-4048-b1ef-f35140cf3b67-kube-api-access-4jl7f\") pod \"7be0eecb-730c-4048-b1ef-f35140cf3b67\" (UID: \"7be0eecb-730c-4048-b1ef-f35140cf3b67\") " Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.351488 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-utilities" (OuterVolumeSpecName: "utilities") pod "7be0eecb-730c-4048-b1ef-f35140cf3b67" (UID: "7be0eecb-730c-4048-b1ef-f35140cf3b67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.355606 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be0eecb-730c-4048-b1ef-f35140cf3b67-kube-api-access-4jl7f" (OuterVolumeSpecName: "kube-api-access-4jl7f") pod "7be0eecb-730c-4048-b1ef-f35140cf3b67" (UID: "7be0eecb-730c-4048-b1ef-f35140cf3b67"). InnerVolumeSpecName "kube-api-access-4jl7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.366724 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7be0eecb-730c-4048-b1ef-f35140cf3b67" (UID: "7be0eecb-730c-4048-b1ef-f35140cf3b67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.451764 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.451799 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jl7f\" (UniqueName: \"kubernetes.io/projected/7be0eecb-730c-4048-b1ef-f35140cf3b67-kube-api-access-4jl7f\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.451815 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7be0eecb-730c-4048-b1ef-f35140cf3b67-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.501317 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d"] Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.545910 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" event={"ID":"ebf85cb6-2651-4b5f-9cbe-973db55e14c5","Type":"ContainerStarted","Data":"2aeb0d24ddd830e7f055ecbac2bc318955c77516da2a4dc5f64c6675ac38ede7"} Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.549110 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mt8k" event={"ID":"7be0eecb-730c-4048-b1ef-f35140cf3b67","Type":"ContainerDied","Data":"6c10df8fa14795835093e99031ede513ad7fb5f1d7e76fdaaec7093e077398c2"} Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.549175 4693 scope.go:117] "RemoveContainer" containerID="f56e7628ec900345f7e04747b3aa0ca5c1a58cd987a153baae00cde11c740d5f" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.549181 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mt8k" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.564814 4693 scope.go:117] "RemoveContainer" containerID="605fa8f704a5a80b640d2d01e2350f17a66f4616a1164f017a548872b8539048" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.582087 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mt8k"] Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.590926 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mt8k"] Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.600633 4693 scope.go:117] "RemoveContainer" containerID="174a2f002b034e3c990e51bd46e92eadf960dd8c4a50f9c8382fbb587ab4c5c8" Nov 25 12:22:12 crc kubenswrapper[4693]: I1125 12:22:12.821958 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" path="/var/lib/kubelet/pods/7be0eecb-730c-4048-b1ef-f35140cf3b67/volumes" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.316326 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-24r56"] Nov 25 12:22:24 crc kubenswrapper[4693]: E1125 12:22:24.317031 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="extract-content" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.317043 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="extract-content" Nov 25 12:22:24 crc kubenswrapper[4693]: E1125 12:22:24.317057 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="registry-server" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.317063 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="registry-server" Nov 25 12:22:24 crc kubenswrapper[4693]: E1125 12:22:24.317075 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="extract-utilities" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.317081 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="extract-utilities" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.317181 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be0eecb-730c-4048-b1ef-f35140cf3b67" containerName="registry-server" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.318032 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.335691 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24r56"] Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.420215 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-utilities\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.420279 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6jrm\" (UniqueName: \"kubernetes.io/projected/64739058-feac-484c-bc60-3f15017a67d9-kube-api-access-j6jrm\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.420355 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-catalog-content\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.521119 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-utilities\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.521173 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6jrm\" (UniqueName: \"kubernetes.io/projected/64739058-feac-484c-bc60-3f15017a67d9-kube-api-access-j6jrm\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.521208 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-catalog-content\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.521686 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-catalog-content\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.521732 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-utilities\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.543611 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j6jrm\" (UniqueName: \"kubernetes.io/projected/64739058-feac-484c-bc60-3f15017a67d9-kube-api-access-j6jrm\") pod \"community-operators-24r56\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") " pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:24 crc kubenswrapper[4693]: I1125 12:22:24.719663 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:25 crc kubenswrapper[4693]: I1125 12:22:25.307538 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-24r56"] Nov 25 12:22:25 crc kubenswrapper[4693]: I1125 12:22:25.652526 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24r56" event={"ID":"64739058-feac-484c-bc60-3f15017a67d9","Type":"ContainerStarted","Data":"df0691f0e8df51a47f026c24285f8e2f7d7ae4bc5e325036a3750ad4a0c2c2fe"} Nov 25 12:22:26 crc kubenswrapper[4693]: I1125 12:22:26.662080 4693 generic.go:334] "Generic (PLEG): container finished" podID="64739058-feac-484c-bc60-3f15017a67d9" containerID="7f0c5031b8dbcebad647f66f69a7a1ff491b1d5acc938074b0306b1e10bd5d4b" exitCode=0 Nov 25 12:22:26 crc kubenswrapper[4693]: I1125 12:22:26.662144 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24r56" event={"ID":"64739058-feac-484c-bc60-3f15017a67d9","Type":"ContainerDied","Data":"7f0c5031b8dbcebad647f66f69a7a1ff491b1d5acc938074b0306b1e10bd5d4b"} Nov 25 12:22:33 crc kubenswrapper[4693]: I1125 12:22:33.717581 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" event={"ID":"ebf85cb6-2651-4b5f-9cbe-973db55e14c5","Type":"ContainerStarted","Data":"3c51e8afbc12b65d0a94a15851f512a69a6009e5bec43aa4991a150caf98c439"} Nov 25 12:22:33 crc kubenswrapper[4693]: I1125 12:22:33.718282 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:22:33 crc kubenswrapper[4693]: I1125 12:22:33.719353 4693 generic.go:334] "Generic (PLEG): container finished" podID="64739058-feac-484c-bc60-3f15017a67d9" containerID="b45de559466efd9e9ade004f3c095ed8cf17d1c03218eef948eda6c78cd21e24" exitCode=0 Nov 25 12:22:33 crc kubenswrapper[4693]: I1125 12:22:33.719412 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24r56" event={"ID":"64739058-feac-484c-bc60-3f15017a67d9","Type":"ContainerDied","Data":"b45de559466efd9e9ade004f3c095ed8cf17d1c03218eef948eda6c78cd21e24"} Nov 25 12:22:33 crc kubenswrapper[4693]: I1125 12:22:33.763319 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" podStartSLOduration=2.702363777 podStartE2EDuration="22.763302034s" podCreationTimestamp="2025-11-25 12:22:11 +0000 UTC" firstStartedPulling="2025-11-25 12:22:12.514338266 +0000 UTC m=+852.432423657" lastFinishedPulling="2025-11-25 12:22:32.575276483 +0000 UTC m=+872.493361914" observedRunningTime="2025-11-25 12:22:33.762682065 +0000 UTC m=+873.680767456" watchObservedRunningTime="2025-11-25 12:22:33.763302034 +0000 UTC m=+873.681387425" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.164445 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tffpb"] Nov 25 
12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.166010 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.173561 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tffpb"] Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.222807 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-utilities\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.222885 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-catalog-content\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.222921 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r499v\" (UniqueName: \"kubernetes.io/projected/30914703-c614-44c5-8add-1a131dfdd142-kube-api-access-r499v\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.323800 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-utilities\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.323884 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-catalog-content\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.323916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r499v\" (UniqueName: \"kubernetes.io/projected/30914703-c614-44c5-8add-1a131dfdd142-kube-api-access-r499v\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.324452 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-catalog-content\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.324461 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-utilities\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: 
I1125 12:22:35.346469 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r499v\" (UniqueName: \"kubernetes.io/projected/30914703-c614-44c5-8add-1a131dfdd142-kube-api-access-r499v\") pod \"redhat-operators-tffpb\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") " pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.547026 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.742454 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24r56" event={"ID":"64739058-feac-484c-bc60-3f15017a67d9","Type":"ContainerStarted","Data":"a3a87e6fa22763971a2d0fac3384cf1d6f60e3dbd16ad5d71be6b22e6cbbbac0"} Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.782930 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-24r56" podStartSLOduration=3.519637664 podStartE2EDuration="11.78291265s" podCreationTimestamp="2025-11-25 12:22:24 +0000 UTC" firstStartedPulling="2025-11-25 12:22:26.664474197 +0000 UTC m=+866.582559588" lastFinishedPulling="2025-11-25 12:22:34.927749193 +0000 UTC m=+874.845834574" observedRunningTime="2025-11-25 12:22:35.762389837 +0000 UTC m=+875.680475228" watchObservedRunningTime="2025-11-25 12:22:35.78291265 +0000 UTC m=+875.700998031" Nov 25 12:22:35 crc kubenswrapper[4693]: I1125 12:22:35.784193 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tffpb"] Nov 25 12:22:35 crc kubenswrapper[4693]: W1125 12:22:35.791938 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30914703_c614_44c5_8add_1a131dfdd142.slice/crio-1e4f0c3cb2dc238119b6b491f4ce66321c26478b6071036dd1746737138ad989 WatchSource:0}: Error finding container 1e4f0c3cb2dc238119b6b491f4ce66321c26478b6071036dd1746737138ad989: Status 404 returned error can't find the container with id 1e4f0c3cb2dc238119b6b491f4ce66321c26478b6071036dd1746737138ad989 Nov 25 12:22:36 crc kubenswrapper[4693]: I1125 12:22:36.750480 4693 generic.go:334] "Generic (PLEG): container finished" podID="30914703-c614-44c5-8add-1a131dfdd142" containerID="f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48" exitCode=0 Nov 25 12:22:36 crc kubenswrapper[4693]: I1125 12:22:36.750559 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tffpb" event={"ID":"30914703-c614-44c5-8add-1a131dfdd142","Type":"ContainerDied","Data":"f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48"} Nov 25 12:22:36 crc kubenswrapper[4693]: I1125 12:22:36.750994 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tffpb" event={"ID":"30914703-c614-44c5-8add-1a131dfdd142","Type":"ContainerStarted","Data":"1e4f0c3cb2dc238119b6b491f4ce66321c26478b6071036dd1746737138ad989"} Nov 25 12:22:38 crc kubenswrapper[4693]: I1125 12:22:38.765768 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tffpb" event={"ID":"30914703-c614-44c5-8add-1a131dfdd142","Type":"ContainerStarted","Data":"a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf"} Nov 25 12:22:39 crc kubenswrapper[4693]: I1125 12:22:39.773975 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="30914703-c614-44c5-8add-1a131dfdd142" containerID="a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf" exitCode=0 Nov 25 12:22:39 crc kubenswrapper[4693]: I1125 12:22:39.774030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tffpb" event={"ID":"30914703-c614-44c5-8add-1a131dfdd142","Type":"ContainerDied","Data":"a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf"} Nov 25 12:22:41 crc kubenswrapper[4693]: I1125 12:22:41.786754 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tffpb" event={"ID":"30914703-c614-44c5-8add-1a131dfdd142","Type":"ContainerStarted","Data":"730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f"} Nov 25 12:22:41 crc kubenswrapper[4693]: I1125 12:22:41.806342 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tffpb" podStartSLOduration=2.692556753 podStartE2EDuration="6.806322404s" podCreationTimestamp="2025-11-25 12:22:35 +0000 UTC" firstStartedPulling="2025-11-25 12:22:36.753435492 +0000 UTC m=+876.671520873" lastFinishedPulling="2025-11-25 12:22:40.867201133 +0000 UTC m=+880.785286524" observedRunningTime="2025-11-25 12:22:41.802900937 +0000 UTC m=+881.720986318" watchObservedRunningTime="2025-11-25 12:22:41.806322404 +0000 UTC m=+881.724407805" Nov 25 12:22:41 crc kubenswrapper[4693]: I1125 12:22:41.982050 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:22:44 crc kubenswrapper[4693]: I1125 12:22:44.720217 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:44 crc kubenswrapper[4693]: I1125 12:22:44.720536 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:44 crc kubenswrapper[4693]: I1125 12:22:44.771835 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:44 crc kubenswrapper[4693]: I1125 12:22:44.849517 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-24r56" Nov 25 12:22:45 crc kubenswrapper[4693]: I1125 12:22:45.011979 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24r56"] Nov 25 12:22:45 crc kubenswrapper[4693]: I1125 12:22:45.547428 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:45 crc kubenswrapper[4693]: I1125 12:22:45.547527 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tffpb" Nov 25 12:22:46 crc kubenswrapper[4693]: I1125 12:22:46.625268 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tffpb" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="registry-server" probeResult="failure" output=< Nov 25 12:22:46 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 12:22:46 crc kubenswrapper[4693]: > Nov 25 12:22:46 crc kubenswrapper[4693]: I1125 12:22:46.813754 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-24r56" 
Nov 25 12:22:50 crc kubenswrapper[4693]: I1125 12:22:50.841221 4693 generic.go:334] "Generic (PLEG): container finished" podID="64739058-feac-484c-bc60-3f15017a67d9" containerID="a3a87e6fa22763971a2d0fac3384cf1d6f60e3dbd16ad5d71be6b22e6cbbbac0" exitCode=0
Nov 25 12:22:50 crc kubenswrapper[4693]: I1125 12:22:50.841273 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24r56" event={"ID":"64739058-feac-484c-bc60-3f15017a67d9","Type":"ContainerDied","Data":"a3a87e6fa22763971a2d0fac3384cf1d6f60e3dbd16ad5d71be6b22e6cbbbac0"}
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.015796 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24r56"
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.127173 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6jrm\" (UniqueName: \"kubernetes.io/projected/64739058-feac-484c-bc60-3f15017a67d9-kube-api-access-j6jrm\") pod \"64739058-feac-484c-bc60-3f15017a67d9\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") "
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.127242 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-utilities\") pod \"64739058-feac-484c-bc60-3f15017a67d9\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") "
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.127421 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-catalog-content\") pod \"64739058-feac-484c-bc60-3f15017a67d9\" (UID: \"64739058-feac-484c-bc60-3f15017a67d9\") "
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.128120 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-utilities" (OuterVolumeSpecName: "utilities") pod "64739058-feac-484c-bc60-3f15017a67d9" (UID: "64739058-feac-484c-bc60-3f15017a67d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.137592 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64739058-feac-484c-bc60-3f15017a67d9-kube-api-access-j6jrm" (OuterVolumeSpecName: "kube-api-access-j6jrm") pod "64739058-feac-484c-bc60-3f15017a67d9" (UID: "64739058-feac-484c-bc60-3f15017a67d9"). InnerVolumeSpecName "kube-api-access-j6jrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.186779 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64739058-feac-484c-bc60-3f15017a67d9" (UID: "64739058-feac-484c-bc60-3f15017a67d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
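The "Killing container with a grace period" entry above carries gracePeriod=2: the runtime delivers SIGTERM first and escalates to SIGKILL if the process is still alive two seconds later, and the ContainerDied PLEG event only surfaces at 12:22:50, the kill time plus the grace window plus relist latency. A minimal sketch of how a server process can spend that window (illustrative only; registry-server's actual shutdown path is not visible in this log):

```go
package main

import (
	"fmt"
	"os"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	term := make(chan os.Signal, 1)
	signal.Notify(term, syscall.SIGTERM) // the signal the runtime sends first

	fmt.Println("serving...")
	<-term

	// From here there are at most terminationGracePeriodSeconds
	// (2s for the pod above) before SIGKILL arrives; finish fast.
	fmt.Println("SIGTERM received, draining")
	time.Sleep(500 * time.Millisecond) // stand-in for real cleanup work
	fmt.Println("clean exit")
}
```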
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.229069 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.229106 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6jrm\" (UniqueName: \"kubernetes.io/projected/64739058-feac-484c-bc60-3f15017a67d9-kube-api-access-j6jrm\") on node \"crc\" DevicePath \"\""
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.229121 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64739058-feac-484c-bc60-3f15017a67d9-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.850453 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-24r56"
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.850446 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-24r56" event={"ID":"64739058-feac-484c-bc60-3f15017a67d9","Type":"ContainerDied","Data":"df0691f0e8df51a47f026c24285f8e2f7d7ae4bc5e325036a3750ad4a0c2c2fe"}
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.850603 4693 scope.go:117] "RemoveContainer" containerID="a3a87e6fa22763971a2d0fac3384cf1d6f60e3dbd16ad5d71be6b22e6cbbbac0"
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.869537 4693 scope.go:117] "RemoveContainer" containerID="b45de559466efd9e9ade004f3c095ed8cf17d1c03218eef948eda6c78cd21e24"
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.876912 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-24r56"]
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.880932 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-24r56"]
Nov 25 12:22:51 crc kubenswrapper[4693]: I1125 12:22:51.910864 4693 scope.go:117] "RemoveContainer" containerID="7f0c5031b8dbcebad647f66f69a7a1ff491b1d5acc938074b0306b1e10bd5d4b"
Nov 25 12:22:52 crc kubenswrapper[4693]: I1125 12:22:52.831444 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64739058-feac-484c-bc60-3f15017a67d9" path="/var/lib/kubelet/pods/64739058-feac-484c-bc60-3f15017a67d9/volumes"
Nov 25 12:22:55 crc kubenswrapper[4693]: I1125 12:22:55.615589 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tffpb"
Nov 25 12:22:55 crc kubenswrapper[4693]: I1125 12:22:55.668429 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tffpb"
Nov 25 12:22:55 crc kubenswrapper[4693]: I1125 12:22:55.856550 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tffpb"]
Nov 25 12:22:56 crc kubenswrapper[4693]: I1125 12:22:56.895403 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tffpb" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="registry-server" containerID="cri-o://730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f" gracePeriod=2
Nov 25 12:22:57 crc kubenswrapper[4693]: I1125 12:22:57.903557 4693 generic.go:334] "Generic (PLEG): container finished" podID="30914703-c614-44c5-8add-1a131dfdd142" containerID="730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f" exitCode=0
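The kube-api-access-j6jrm volume unmounted above is a projected service-account volume; while the pod runs it surfaces a token, the cluster CA bundle, and the namespace under the conventional in-container path. A small sketch that reads those files, which only does something useful when executed inside a pod:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Standard mount point of the kube-api-access-* projected volume.
	dir := "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(filepath.Join(dir, name))
		if err != nil {
			fmt.Println(name, "->", err) // e.g. not running in a pod
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(b))
	}
}
```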
Nov 25 12:22:57 crc kubenswrapper[4693]: I1125 12:22:57.903655 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tffpb"
Nov 25 12:22:57 crc kubenswrapper[4693]: I1125 12:22:57.903692 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tffpb" event={"ID":"30914703-c614-44c5-8add-1a131dfdd142","Type":"ContainerDied","Data":"730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f"}
Nov 25 12:22:57 crc kubenswrapper[4693]: I1125 12:22:57.904175 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tffpb" event={"ID":"30914703-c614-44c5-8add-1a131dfdd142","Type":"ContainerDied","Data":"1e4f0c3cb2dc238119b6b491f4ce66321c26478b6071036dd1746737138ad989"}
Nov 25 12:22:57 crc kubenswrapper[4693]: I1125 12:22:57.904203 4693 scope.go:117] "RemoveContainer" containerID="730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f"
Nov 25 12:22:57 crc kubenswrapper[4693]: I1125 12:22:57.924186 4693 scope.go:117] "RemoveContainer" containerID="a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf"
Nov 25 12:22:57 crc kubenswrapper[4693]: I1125 12:22:57.992771 4693 scope.go:117] "RemoveContainer" containerID="f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.015086 4693 scope.go:117] "RemoveContainer" containerID="730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.015595 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f\": container with ID starting with 730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f not found: ID does not exist" containerID="730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.015630 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f"} err="failed to get container status \"730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f\": rpc error: code = NotFound desc = could not find container \"730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f\": container with ID starting with 730ed83125f439efbc0e4f27bc2315e76a72a22216ee11285d18bfd2a9e9354f not found: ID does not exist"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.015657 4693 scope.go:117] "RemoveContainer" containerID="a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.015937 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf\": container with ID starting with a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf not found: ID does not exist" containerID="a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.015964 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf"} err="failed to get container status \"a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf\": rpc error: code = NotFound desc = could not find container \"a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf\": container with ID starting with a87b2cadfd00d8eb1fa861d080975b95a91ba2fd4abcabb073396022d2f6d7cf not found: ID does not exist"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.015982 4693 scope.go:117] "RemoveContainer" containerID="f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.016226 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48\": container with ID starting with f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48 not found: ID does not exist" containerID="f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.016253 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48"} err="failed to get container status \"f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48\": rpc error: code = NotFound desc = could not find container \"f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48\": container with ID starting with f84ae78822ac89d244de04741941b55d20ef57cd9ce6d7e6afb5db66d5d63b48 not found: ID does not exist"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.016871 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-utilities\") pod \"30914703-c614-44c5-8add-1a131dfdd142\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") "
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.017563 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-catalog-content\") pod \"30914703-c614-44c5-8add-1a131dfdd142\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") "
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.017614 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r499v\" (UniqueName: \"kubernetes.io/projected/30914703-c614-44c5-8add-1a131dfdd142-kube-api-access-r499v\") pod \"30914703-c614-44c5-8add-1a131dfdd142\" (UID: \"30914703-c614-44c5-8add-1a131dfdd142\") "
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.017649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-utilities" (OuterVolumeSpecName: "utilities") pod "30914703-c614-44c5-8add-1a131dfdd142" (UID: "30914703-c614-44c5-8add-1a131dfdd142"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
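The E1125 "ContainerStatus from runtime service failed ... NotFound" entries above are noisy but benign: the kubelet re-issues container removal for IDs CRI-O has already deleted, and NotFound just means there is nothing left to remove. A sketch of that error classification with gRPC status codes (removeContainer is a hypothetical wrapper for illustration, not the kubelet's own function):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer treats NotFound from the runtime as success, which
// makes deletion idempotent across repeated sync-loop passes.
func removeContainer(id string, rpc func(string) error) error {
	if err := rpc(id); status.Code(err) != codes.NotFound {
		return err // nil on success, or a real failure
	}
	return nil // already gone: the situation logged above
}

func main() {
	gone := func(id string) error {
		return status.Error(codes.NotFound, "could not find container "+id)
	}
	fmt.Println(removeContainer("730ed83125f4", gone)) // <nil>
}
```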
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.017873 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.027605 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30914703-c614-44c5-8add-1a131dfdd142-kube-api-access-r499v" (OuterVolumeSpecName: "kube-api-access-r499v") pod "30914703-c614-44c5-8add-1a131dfdd142" (UID: "30914703-c614-44c5-8add-1a131dfdd142"). InnerVolumeSpecName "kube-api-access-r499v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.119435 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r499v\" (UniqueName: \"kubernetes.io/projected/30914703-c614-44c5-8add-1a131dfdd142-kube-api-access-r499v\") on node \"crc\" DevicePath \"\""
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.131556 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30914703-c614-44c5-8add-1a131dfdd142" (UID: "30914703-c614-44c5-8add-1a131dfdd142"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.221411 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30914703-c614-44c5-8add-1a131dfdd142-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.811878 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj"]
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.812431 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64739058-feac-484c-bc60-3f15017a67d9" containerName="extract-content"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812450 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="64739058-feac-484c-bc60-3f15017a67d9" containerName="extract-content"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.812469 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="registry-server"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812478 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="registry-server"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.812489 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64739058-feac-484c-bc60-3f15017a67d9" containerName="extract-utilities"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812497 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="64739058-feac-484c-bc60-3f15017a67d9" containerName="extract-utilities"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.812508 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="extract-content"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812515 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="extract-content"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.812529 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64739058-feac-484c-bc60-3f15017a67d9" containerName="registry-server"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812537 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="64739058-feac-484c-bc60-3f15017a67d9" containerName="registry-server"
Nov 25 12:22:58 crc kubenswrapper[4693]: E1125 12:22:58.812549 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="extract-utilities"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812555 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="extract-utilities"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812692 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="30914703-c614-44c5-8add-1a131dfdd142" containerName="registry-server"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.812707 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="64739058-feac-484c-bc60-3f15017a67d9" containerName="registry-server"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.813583 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.816688 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4cpkn"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.825002 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.832219 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.833402 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.854468 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dxsx4"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.859439 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.873826 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.875068 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6"
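"SyncLoop ADD" source="api" marks pods arriving on the kubelet's API-server watch; the burst that follows is the openstack-operators namespace scheduling its controller-manager pods onto this node. The same ADD/UPDATE stream can be observed off-node with a client-go informer; a minimal sketch assuming a reachable kubeconfig at the default location (the kubelet itself uses its own config-source sync loop, not this helper):

```go
package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Watch only the namespace the operator pods above are created in.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 30*time.Second, informers.WithNamespace("openstack-operators"))

	factory.Core().V1().Pods().Informer().AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			fmt.Println("ADD", obj.(*corev1.Pod).Name)
		},
		UpdateFunc: func(_, obj interface{}) {
			fmt.Println("UPDATE", obj.(*corev1.Pod).Name)
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	factory.WaitForCacheSync(stop)
	select {} // stream events until the process is killed
}
```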
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.883358 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9jrzc"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.886290 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-866fd"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.889313 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.899767 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h7bgw"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.906772 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.919439 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-866fd"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.922330 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tffpb"
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.929842 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-nzz29"]
Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.931174 4693 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.932018 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n87p\" (UniqueName: \"kubernetes.io/projected/9cc5c4a9-0119-48b6-a795-9f482b55278b-kube-api-access-9n87p\") pod \"designate-operator-controller-manager-7d695c9b56-6dtx6\" (UID: \"9cc5c4a9-0119-48b6-a795-9f482b55278b\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.932165 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4v48\" (UniqueName: \"kubernetes.io/projected/2f11c884-15fc-4e2a-a533-d0eac0639f80-kube-api-access-x4v48\") pod \"barbican-operator-controller-manager-86dc4d89c8-6wxtj\" (UID: \"2f11c884-15fc-4e2a-a533-d0eac0639f80\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.932217 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smwnw\" (UniqueName: \"kubernetes.io/projected/4ab70f55-282f-4509-bc36-71ef2fe4d35b-kube-api-access-smwnw\") pod \"cinder-operator-controller-manager-79856dc55c-4lt8v\" (UID: \"4ab70f55-282f-4509-bc36-71ef2fe4d35b\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.935063 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vx9wx" Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.946264 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj"] Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.948864 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.954029 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-spb4r" Nov 25 12:22:58 crc kubenswrapper[4693]: I1125 12:22:58.981887 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-nzz29"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.036830 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9nf\" (UniqueName: \"kubernetes.io/projected/b29c9c21-026a-4701-99a7-769d382a2da2-kube-api-access-pv9nf\") pod \"heat-operator-controller-manager-774b86978c-nzz29\" (UID: \"b29c9c21-026a-4701-99a7-769d382a2da2\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.036915 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n87p\" (UniqueName: \"kubernetes.io/projected/9cc5c4a9-0119-48b6-a795-9f482b55278b-kube-api-access-9n87p\") pod \"designate-operator-controller-manager-7d695c9b56-6dtx6\" (UID: \"9cc5c4a9-0119-48b6-a795-9f482b55278b\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.037007 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8js7\" (UniqueName: \"kubernetes.io/projected/7cb65a4e-3294-4104-b3bf-6d1103b92c38-kube-api-access-q8js7\") pod \"glance-operator-controller-manager-68b95954c9-866fd\" (UID: \"7cb65a4e-3294-4104-b3bf-6d1103b92c38\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.037059 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrj5x\" (UniqueName: \"kubernetes.io/projected/4dd9cd53-1f66-4636-9fab-9f0b3ff38009-kube-api-access-hrj5x\") pod \"horizon-operator-controller-manager-68c9694994-fwwsj\" (UID: \"4dd9cd53-1f66-4636-9fab-9f0b3ff38009\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.037088 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4v48\" (UniqueName: \"kubernetes.io/projected/2f11c884-15fc-4e2a-a533-d0eac0639f80-kube-api-access-x4v48\") pod \"barbican-operator-controller-manager-86dc4d89c8-6wxtj\" (UID: \"2f11c884-15fc-4e2a-a533-d0eac0639f80\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.037129 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smwnw\" (UniqueName: \"kubernetes.io/projected/4ab70f55-282f-4509-bc36-71ef2fe4d35b-kube-api-access-smwnw\") pod \"cinder-operator-controller-manager-79856dc55c-4lt8v\" (UID: \"4ab70f55-282f-4509-bc36-71ef2fe4d35b\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.069046 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj"] Nov 25 12:22:59 crc 
kubenswrapper[4693]: I1125 12:22:59.084467 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4v48\" (UniqueName: \"kubernetes.io/projected/2f11c884-15fc-4e2a-a533-d0eac0639f80-kube-api-access-x4v48\") pod \"barbican-operator-controller-manager-86dc4d89c8-6wxtj\" (UID: \"2f11c884-15fc-4e2a-a533-d0eac0639f80\") " pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.096668 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n87p\" (UniqueName: \"kubernetes.io/projected/9cc5c4a9-0119-48b6-a795-9f482b55278b-kube-api-access-9n87p\") pod \"designate-operator-controller-manager-7d695c9b56-6dtx6\" (UID: \"9cc5c4a9-0119-48b6-a795-9f482b55278b\") " pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.096730 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.097763 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.102030 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t8gn6" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.102256 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.105018 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smwnw\" (UniqueName: \"kubernetes.io/projected/4ab70f55-282f-4509-bc36-71ef2fe4d35b-kube-api-access-smwnw\") pod \"cinder-operator-controller-manager-79856dc55c-4lt8v\" (UID: \"4ab70f55-282f-4509-bc36-71ef2fe4d35b\") " pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.108424 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.109568 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.114342 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.120080 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-bn5rr" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.126439 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.134125 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.145844 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.146936 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.152093 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-9jjvz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.152512 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c98082e-070e-42b1-afdc-69cea132629e-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-r86ct\" (UID: \"5c98082e-070e-42b1-afdc-69cea132629e\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.152601 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8js7\" (UniqueName: \"kubernetes.io/projected/7cb65a4e-3294-4104-b3bf-6d1103b92c38-kube-api-access-q8js7\") pod \"glance-operator-controller-manager-68b95954c9-866fd\" (UID: \"7cb65a4e-3294-4104-b3bf-6d1103b92c38\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.152645 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrj5x\" (UniqueName: \"kubernetes.io/projected/4dd9cd53-1f66-4636-9fab-9f0b3ff38009-kube-api-access-hrj5x\") pod \"horizon-operator-controller-manager-68c9694994-fwwsj\" (UID: \"4dd9cd53-1f66-4636-9fab-9f0b3ff38009\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.152680 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxn6j\" (UniqueName: \"kubernetes.io/projected/5c98082e-070e-42b1-afdc-69cea132629e-kube-api-access-rxn6j\") pod \"infra-operator-controller-manager-d5cc86f4b-r86ct\" (UID: \"5c98082e-070e-42b1-afdc-69cea132629e\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.152734 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9nf\" (UniqueName: \"kubernetes.io/projected/b29c9c21-026a-4701-99a7-769d382a2da2-kube-api-access-pv9nf\") pod \"heat-operator-controller-manager-774b86978c-nzz29\" (UID: \"b29c9c21-026a-4701-99a7-769d382a2da2\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.153191 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.162225 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tffpb"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.181626 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tffpb"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.195800 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.206120 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv9nf\" (UniqueName: \"kubernetes.io/projected/b29c9c21-026a-4701-99a7-769d382a2da2-kube-api-access-pv9nf\") pod \"heat-operator-controller-manager-774b86978c-nzz29\" (UID: \"b29c9c21-026a-4701-99a7-769d382a2da2\") " pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.211677 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.228092 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8js7\" (UniqueName: \"kubernetes.io/projected/7cb65a4e-3294-4104-b3bf-6d1103b92c38-kube-api-access-q8js7\") pod \"glance-operator-controller-manager-68b95954c9-866fd\" (UID: \"7cb65a4e-3294-4104-b3bf-6d1103b92c38\") " pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.228653 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrj5x\" (UniqueName: \"kubernetes.io/projected/4dd9cd53-1f66-4636-9fab-9f0b3ff38009-kube-api-access-hrj5x\") pod \"horizon-operator-controller-manager-68c9694994-fwwsj\" (UID: \"4dd9cd53-1f66-4636-9fab-9f0b3ff38009\") " pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.228703 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.229833 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.244919 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hbbx6" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.253481 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmf9n\" (UniqueName: \"kubernetes.io/projected/3c29e8b9-57cf-4967-b5e2-a6af42c16099-kube-api-access-hmf9n\") pod \"ironic-operator-controller-manager-5bfcdc958c-szrv4\" (UID: \"3c29e8b9-57cf-4967-b5e2-a6af42c16099\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.253552 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxn6j\" (UniqueName: \"kubernetes.io/projected/5c98082e-070e-42b1-afdc-69cea132629e-kube-api-access-rxn6j\") pod \"infra-operator-controller-manager-d5cc86f4b-r86ct\" (UID: \"5c98082e-070e-42b1-afdc-69cea132629e\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.256841 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khwc5\" (UniqueName: \"kubernetes.io/projected/a64b0f5c-e6af-4903-925a-028aec5477fd-kube-api-access-khwc5\") pod \"keystone-operator-controller-manager-748dc6576f-zcpsz\" (UID: \"a64b0f5c-e6af-4903-925a-028aec5477fd\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.256891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c98082e-070e-42b1-afdc-69cea132629e-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-r86ct\" (UID: \"5c98082e-070e-42b1-afdc-69cea132629e\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.254035 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.262735 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c98082e-070e-42b1-afdc-69cea132629e-cert\") pod \"infra-operator-controller-manager-d5cc86f4b-r86ct\" (UID: \"5c98082e-070e-42b1-afdc-69cea132629e\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.278283 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.281272 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.290722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxn6j\" (UniqueName: \"kubernetes.io/projected/5c98082e-070e-42b1-afdc-69cea132629e-kube-api-access-rxn6j\") pod \"infra-operator-controller-manager-d5cc86f4b-r86ct\" (UID: \"5c98082e-070e-42b1-afdc-69cea132629e\") " pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.291014 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-c9m8g" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.296682 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.300269 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.320534 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.328651 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.332581 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.335771 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-j2j9t" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.349889 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.351200 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.353412 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-q867m" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.358294 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq974\" (UniqueName: \"kubernetes.io/projected/bfeee7c1-207f-4862-b172-f2ffab4a1500-kube-api-access-dq974\") pod \"manila-operator-controller-manager-58bb8d67cc-5ghnq\" (UID: \"bfeee7c1-207f-4862-b172-f2ffab4a1500\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.358337 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khwc5\" (UniqueName: \"kubernetes.io/projected/a64b0f5c-e6af-4903-925a-028aec5477fd-kube-api-access-khwc5\") pod \"keystone-operator-controller-manager-748dc6576f-zcpsz\" (UID: \"a64b0f5c-e6af-4903-925a-028aec5477fd\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.358408 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmf9n\" (UniqueName: \"kubernetes.io/projected/3c29e8b9-57cf-4967-b5e2-a6af42c16099-kube-api-access-hmf9n\") pod \"ironic-operator-controller-manager-5bfcdc958c-szrv4\" (UID: \"3c29e8b9-57cf-4967-b5e2-a6af42c16099\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.358503 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5gs\" (UniqueName: \"kubernetes.io/projected/22a83ecc-1f72-4474-a470-2ee4bef7eddf-kube-api-access-tx5gs\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-s9shw\" (UID: \"22a83ecc-1f72-4474-a470-2ee4bef7eddf\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.370348 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.379794 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khwc5\" (UniqueName: \"kubernetes.io/projected/a64b0f5c-e6af-4903-925a-028aec5477fd-kube-api-access-khwc5\") pod \"keystone-operator-controller-manager-748dc6576f-zcpsz\" (UID: \"a64b0f5c-e6af-4903-925a-028aec5477fd\") " pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.383980 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.388809 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmf9n\" (UniqueName: \"kubernetes.io/projected/3c29e8b9-57cf-4967-b5e2-a6af42c16099-kube-api-access-hmf9n\") pod \"ironic-operator-controller-manager-5bfcdc958c-szrv4\" (UID: \"3c29e8b9-57cf-4967-b5e2-a6af42c16099\") " pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.402532 4693 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.403853 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.413639 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-slsbp" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.413810 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.443776 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.447469 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.450794 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.452107 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.452730 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.459705 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lbvsr" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.459891 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-62xn5" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.461224 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.461303 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btr47\" (UniqueName: \"kubernetes.io/projected/7ecc8c23-d9b2-4d46-a8b0-76758035b267-kube-api-access-btr47\") pod \"nova-operator-controller-manager-79556f57fc-flxdz\" (UID: \"7ecc8c23-d9b2-4d46-a8b0-76758035b267\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.461418 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5gs\" (UniqueName: \"kubernetes.io/projected/22a83ecc-1f72-4474-a470-2ee4bef7eddf-kube-api-access-tx5gs\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-s9shw\" (UID: \"22a83ecc-1f72-4474-a470-2ee4bef7eddf\") " 
pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.461464 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq974\" (UniqueName: \"kubernetes.io/projected/bfeee7c1-207f-4862-b172-f2ffab4a1500-kube-api-access-dq974\") pod \"manila-operator-controller-manager-58bb8d67cc-5ghnq\" (UID: \"bfeee7c1-207f-4862-b172-f2ffab4a1500\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.461536 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pklr\" (UniqueName: \"kubernetes.io/projected/a7c4eb9b-38af-41da-872e-b3da515b2f88-kube-api-access-4pklr\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.461579 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sglkk\" (UniqueName: \"kubernetes.io/projected/0f35f544-581e-4cb2-900f-71213e27477d-kube-api-access-sglkk\") pod \"neutron-operator-controller-manager-7c57c8bbc4-csrpt\" (UID: \"0f35f544-581e-4cb2-900f-71213e27477d\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.471708 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.482164 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5gs\" (UniqueName: \"kubernetes.io/projected/22a83ecc-1f72-4474-a470-2ee4bef7eddf-kube-api-access-tx5gs\") pod \"mariadb-operator-controller-manager-cb6c4fdb7-s9shw\" (UID: \"22a83ecc-1f72-4474-a470-2ee4bef7eddf\") " pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.500959 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.513183 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.521236 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq974\" (UniqueName: \"kubernetes.io/projected/bfeee7c1-207f-4862-b172-f2ffab4a1500-kube-api-access-dq974\") pod \"manila-operator-controller-manager-58bb8d67cc-5ghnq\" (UID: \"bfeee7c1-207f-4862-b172-f2ffab4a1500\") " pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.527569 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.530434 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.533656 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8gxsw" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.556322 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.559251 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.566389 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-sh496" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.569710 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.567362 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pklr\" (UniqueName: \"kubernetes.io/projected/a7c4eb9b-38af-41da-872e-b3da515b2f88-kube-api-access-4pklr\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.574423 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sglkk\" (UniqueName: \"kubernetes.io/projected/0f35f544-581e-4cb2-900f-71213e27477d-kube-api-access-sglkk\") pod \"neutron-operator-controller-manager-7c57c8bbc4-csrpt\" (UID: \"0f35f544-581e-4cb2-900f-71213e27477d\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.574464 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdnw\" (UniqueName: \"kubernetes.io/projected/1c7db975-17d7-48dd-8e5a-0549749ab866-kube-api-access-qrdnw\") pod \"ovn-operator-controller-manager-66cf5c67ff-k2njb\" (UID: \"1c7db975-17d7-48dd-8e5a-0549749ab866\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.574508 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mqmt\" (UniqueName: \"kubernetes.io/projected/f6bc1c64-200f-492f-bad9-dfecd5687698-kube-api-access-9mqmt\") pod \"placement-operator-controller-manager-5db546f9d9-f4trp\" (UID: \"f6bc1c64-200f-492f-bad9-dfecd5687698\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.574535 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: 
\"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg"
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.574654 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btr47\" (UniqueName: \"kubernetes.io/projected/7ecc8c23-d9b2-4d46-a8b0-76758035b267-kube-api-access-btr47\") pod \"nova-operator-controller-manager-79556f57fc-flxdz\" (UID: \"7ecc8c23-d9b2-4d46-a8b0-76758035b267\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz"
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.574689 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88f7\" (UniqueName: \"kubernetes.io/projected/fe2a0074-66dc-4730-9321-772ee8fd8e28-kube-api-access-w88f7\") pod \"octavia-operator-controller-manager-fd75fd47d-g972v\" (UID: \"fe2a0074-66dc-4730-9321-772ee8fd8e28\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v"
Nov 25 12:22:59 crc kubenswrapper[4693]: E1125 12:22:59.574893 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 25 12:22:59 crc kubenswrapper[4693]: E1125 12:22:59.574947 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert podName:a7c4eb9b-38af-41da-872e-b3da515b2f88 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:00.0749266 +0000 UTC m=+899.993011981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert") pod "openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" (UID: "a7c4eb9b-38af-41da-872e-b3da515b2f88") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.583075 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz"
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.590196 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb"]
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.599696 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pklr\" (UniqueName: \"kubernetes.io/projected/a7c4eb9b-38af-41da-872e-b3da515b2f88-kube-api-access-4pklr\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg"
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.614722 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq"
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.616286 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs"]
Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.618578 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs"
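The nestedpendingoperations.go entry above embargoes the failed cert mount rather than retrying immediately: "No retries permitted until 12:23:00.07" is 500ms (durationBeforeRetry) after the failure, and the window grows on repeated failures until the missing openstack-baremetal-operator-webhook-server-cert secret appears. A sketch of that retry shape; the doubling factor and the cap are illustrative assumptions, only the 500ms initial delay comes from this log:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

func mountWithBackoff(mount func() error) error {
	delay := 500 * time.Millisecond // durationBeforeRetry seen above
	const maxDelay = 2 * time.Minute
	for {
		err := mount()
		if err == nil {
			return nil
		}
		fmt.Printf("mount failed (%v); no retries permitted for %v\n", err, delay)
		time.Sleep(delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay // cap the embargo window
		}
	}
}

func main() {
	attempts := 0
	_ = mountWithBackoff(func() error {
		if attempts++; attempts < 3 {
			return errors.New(`secret "openstack-baremetal-operator-webhook-server-cert" not found`)
		}
		return nil // the webhook cert eventually exists and SetUp succeeds
	})
	fmt.Println("mounted after", attempts, "attempts")
}
```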
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.626813 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5n4d9" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.630087 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sglkk\" (UniqueName: \"kubernetes.io/projected/0f35f544-581e-4cb2-900f-71213e27477d-kube-api-access-sglkk\") pod \"neutron-operator-controller-manager-7c57c8bbc4-csrpt\" (UID: \"0f35f544-581e-4cb2-900f-71213e27477d\") " pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.637632 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.647556 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btr47\" (UniqueName: \"kubernetes.io/projected/7ecc8c23-d9b2-4d46-a8b0-76758035b267-kube-api-access-btr47\") pod \"nova-operator-controller-manager-79556f57fc-flxdz\" (UID: \"7ecc8c23-d9b2-4d46-a8b0-76758035b267\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.652438 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.685119 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.688397 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.704827 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88f7\" (UniqueName: \"kubernetes.io/projected/fe2a0074-66dc-4730-9321-772ee8fd8e28-kube-api-access-w88f7\") pod \"octavia-operator-controller-manager-fd75fd47d-g972v\" (UID: \"fe2a0074-66dc-4730-9321-772ee8fd8e28\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.704904 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb594\" (UniqueName: \"kubernetes.io/projected/b9227546-dcce-4b09-9311-19f844deb318-kube-api-access-gb594\") pod \"telemetry-operator-controller-manager-567f98c9d-cwrvs\" (UID: \"b9227546-dcce-4b09-9311-19f844deb318\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.705328 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdnw\" (UniqueName: \"kubernetes.io/projected/1c7db975-17d7-48dd-8e5a-0549749ab866-kube-api-access-qrdnw\") pod \"ovn-operator-controller-manager-66cf5c67ff-k2njb\" (UID: \"1c7db975-17d7-48dd-8e5a-0549749ab866\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.705558 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mqmt\" (UniqueName: \"kubernetes.io/projected/f6bc1c64-200f-492f-bad9-dfecd5687698-kube-api-access-9mqmt\") pod \"placement-operator-controller-manager-5db546f9d9-f4trp\" (UID: \"f6bc1c64-200f-492f-bad9-dfecd5687698\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.705623 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kknr\" (UniqueName: \"kubernetes.io/projected/c3a7c8cb-ac3c-43d3-b38d-0c3625c53196-kube-api-access-6kknr\") pod \"swift-operator-controller-manager-6fdc4fcf86-bnf27\" (UID: \"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.737856 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mqmt\" (UniqueName: \"kubernetes.io/projected/f6bc1c64-200f-492f-bad9-dfecd5687698-kube-api-access-9mqmt\") pod \"placement-operator-controller-manager-5db546f9d9-f4trp\" (UID: \"f6bc1c64-200f-492f-bad9-dfecd5687698\") " pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.759709 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.779187 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88f7\" (UniqueName: \"kubernetes.io/projected/fe2a0074-66dc-4730-9321-772ee8fd8e28-kube-api-access-w88f7\") pod \"octavia-operator-controller-manager-fd75fd47d-g972v\" (UID: \"fe2a0074-66dc-4730-9321-772ee8fd8e28\") " pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.783012 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdnw\" (UniqueName: \"kubernetes.io/projected/1c7db975-17d7-48dd-8e5a-0549749ab866-kube-api-access-qrdnw\") pod \"ovn-operator-controller-manager-66cf5c67ff-k2njb\" (UID: \"1c7db975-17d7-48dd-8e5a-0549749ab866\") " pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.835696 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kknr\" (UniqueName: \"kubernetes.io/projected/c3a7c8cb-ac3c-43d3-b38d-0c3625c53196-kube-api-access-6kknr\") pod \"swift-operator-controller-manager-6fdc4fcf86-bnf27\" (UID: \"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.835770 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb594\" (UniqueName: \"kubernetes.io/projected/b9227546-dcce-4b09-9311-19f844deb318-kube-api-access-gb594\") pod \"telemetry-operator-controller-manager-567f98c9d-cwrvs\" (UID: \"b9227546-dcce-4b09-9311-19f844deb318\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.842799 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.843392 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.855036 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.895441 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kknr\" (UniqueName: \"kubernetes.io/projected/c3a7c8cb-ac3c-43d3-b38d-0c3625c53196-kube-api-access-6kknr\") pod \"swift-operator-controller-manager-6fdc4fcf86-bnf27\" (UID: \"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196\") " pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.895761 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.898241 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb594\" (UniqueName: \"kubernetes.io/projected/b9227546-dcce-4b09-9311-19f844deb318-kube-api-access-gb594\") pod \"telemetry-operator-controller-manager-567f98c9d-cwrvs\" (UID: \"b9227546-dcce-4b09-9311-19f844deb318\") " pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.924010 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8"] Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.925290 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.930090 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.939738 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mgh26" Nov 25 12:22:59 crc kubenswrapper[4693]: I1125 12:22:59.949982 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.039273 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkfs\" (UniqueName: \"kubernetes.io/projected/ef0b302b-05d0-4be3-85ad-7eb3d70cec36-kube-api-access-vjkfs\") pod \"test-operator-controller-manager-5cb74df96-kmpm8\" (UID: \"ef0b302b-05d0-4be3-85ad-7eb3d70cec36\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.043586 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-tc9jb"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.045036 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.049992 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nt867" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.061204 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-tc9jb"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.120193 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.121588 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.127712 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.127969 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.128192 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5mjsr" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.140191 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkfs\" (UniqueName: \"kubernetes.io/projected/ef0b302b-05d0-4be3-85ad-7eb3d70cec36-kube-api-access-vjkfs\") pod \"test-operator-controller-manager-5cb74df96-kmpm8\" (UID: \"ef0b302b-05d0-4be3-85ad-7eb3d70cec36\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.140338 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.140416 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpv8s\" (UniqueName: \"kubernetes.io/projected/105791fd-407d-44a3-8fc8-af90e82b0f63-kube-api-access-wpv8s\") pod \"watcher-operator-controller-manager-864885998-tc9jb\" (UID: \"105791fd-407d-44a3-8fc8-af90e82b0f63\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.140864 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.140921 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert podName:a7c4eb9b-38af-41da-872e-b3da515b2f88 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:01.140904284 +0000 UTC m=+901.058989665 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert") pod "openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" (UID: "a7c4eb9b-38af-41da-872e-b3da515b2f88") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.145724 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.159010 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.159980 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.162605 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-t2wvn" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.171016 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.177472 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.184259 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkfs\" (UniqueName: \"kubernetes.io/projected/ef0b302b-05d0-4be3-85ad-7eb3d70cec36-kube-api-access-vjkfs\") pod \"test-operator-controller-manager-5cb74df96-kmpm8\" (UID: \"ef0b302b-05d0-4be3-85ad-7eb3d70cec36\") " pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.189673 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.241967 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.242299 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.242350 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2f2t\" (UniqueName: \"kubernetes.io/projected/c80a0f65-6193-435f-8138-eb5a4ba71b22-kube-api-access-x2f2t\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.242424 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq25c\" (UniqueName: \"kubernetes.io/projected/28782f20-4534-4137-b590-7a3b31c638b2-kube-api-access-zq25c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qbjp2\" (UID: \"28782f20-4534-4137-b590-7a3b31c638b2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.242489 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpv8s\" (UniqueName: \"kubernetes.io/projected/105791fd-407d-44a3-8fc8-af90e82b0f63-kube-api-access-wpv8s\") pod 
\"watcher-operator-controller-manager-864885998-tc9jb\" (UID: \"105791fd-407d-44a3-8fc8-af90e82b0f63\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.264947 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.272691 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpv8s\" (UniqueName: \"kubernetes.io/projected/105791fd-407d-44a3-8fc8-af90e82b0f63-kube-api-access-wpv8s\") pod \"watcher-operator-controller-manager-864885998-tc9jb\" (UID: \"105791fd-407d-44a3-8fc8-af90e82b0f63\") " pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.287904 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.339811 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.348586 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-774b86978c-nzz29"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.348650 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.355730 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.355771 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.355820 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2f2t\" (UniqueName: \"kubernetes.io/projected/c80a0f65-6193-435f-8138-eb5a4ba71b22-kube-api-access-x2f2t\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.355874 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq25c\" (UniqueName: \"kubernetes.io/projected/28782f20-4534-4137-b590-7a3b31c638b2-kube-api-access-zq25c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qbjp2\" (UID: \"28782f20-4534-4137-b590-7a3b31c638b2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.356307 4693 secret.go:188] Couldn't get secret 
openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.356364 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs podName:c80a0f65-6193-435f-8138-eb5a4ba71b22 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:00.856346304 +0000 UTC m=+900.774431685 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs") pod "openstack-operator-controller-manager-7cd5954d9-rqjq9" (UID: "c80a0f65-6193-435f-8138-eb5a4ba71b22") : secret "webhook-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.356437 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.356519 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs podName:c80a0f65-6193-435f-8138-eb5a4ba71b22 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:00.856496848 +0000 UTC m=+900.774582289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs") pod "openstack-operator-controller-manager-7cd5954d9-rqjq9" (UID: "c80a0f65-6193-435f-8138-eb5a4ba71b22") : secret "metrics-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.356655 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.366437 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.382616 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2f2t\" (UniqueName: \"kubernetes.io/projected/c80a0f65-6193-435f-8138-eb5a4ba71b22-kube-api-access-x2f2t\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: W1125 12:23:00.384028 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f11c884_15fc_4e2a_a533_d0eac0639f80.slice/crio-c2bb000e52ae3b3016a54d4aae21638162f1dd80536f2fa178254fca73f83a29 WatchSource:0}: Error finding container c2bb000e52ae3b3016a54d4aae21638162f1dd80536f2fa178254fca73f83a29: Status 404 returned error can't find the container with id c2bb000e52ae3b3016a54d4aae21638162f1dd80536f2fa178254fca73f83a29 Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.389536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq25c\" (UniqueName: \"kubernetes.io/projected/28782f20-4534-4137-b590-7a3b31c638b2-kube-api-access-zq25c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-qbjp2\" (UID: \"28782f20-4534-4137-b590-7a3b31c638b2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" Nov 25 12:23:00 crc kubenswrapper[4693]: W1125 12:23:00.394825 4693 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc5c4a9_0119_48b6_a795_9f482b55278b.slice/crio-22ca659bf34ffb248224b156aa811659ef83312c03b8d51c450436f7c591ce67 WatchSource:0}: Error finding container 22ca659bf34ffb248224b156aa811659ef83312c03b8d51c450436f7c591ce67: Status 404 returned error can't find the container with id 22ca659bf34ffb248224b156aa811659ef83312c03b8d51c450436f7c591ce67 Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.565224 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.592822 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq"] Nov 25 12:23:00 crc kubenswrapper[4693]: W1125 12:23:00.610840 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfeee7c1_207f_4862_b172_f2ffab4a1500.slice/crio-6a961f123a33ee120610305e807c2d3a7ea04bbe7f1e7071d2ffd4efd05999a7 WatchSource:0}: Error finding container 6a961f123a33ee120610305e807c2d3a7ea04bbe7f1e7071d2ffd4efd05999a7: Status 404 returned error can't find the container with id 6a961f123a33ee120610305e807c2d3a7ea04bbe7f1e7071d2ffd4efd05999a7 Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.611495 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-68b95954c9-866fd"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.625731 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.626529 4693 util.go:30] "No sandbox for pod can be found. 
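The W-level "Failed to process watch event ... Status 404" records above come from cAdvisor racing freshly created CRI-O containers: the cgroup appears before the container can be resolved by ID, so the lookup 404s; during a mass pod startup like this one they are noise. Since several distinct patterns recur in this journal, a small sketch that tallies them from a saved log read on stdin may help triage (the regexes are message fragments from this excerpt; adjust as needed):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Message fragments copied from this journal; extend as needed.
	patterns := map[string]*regexp.Regexp{
		"cadvisor watch 404": regexp.MustCompile(`Failed to process watch event .*Status 404`),
		"secret not found":   regexp.MustCompile(`Couldn't get secret`),
		"pull QPS exceeded":  regexp.MustCompile(`pull QPS exceeded`),
		"image pull backoff": regexp.MustCompile(`ImagePullBackOff`),
	}
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		for name, re := range patterns {
			if re.MatchString(sc.Text()) {
				counts[name]++
			}
		}
	}
	for name, n := range counts {
		fmt.Printf("%-20s %d\n", name, n)
	}
}
```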
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.698807 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.718156 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.825935 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30914703-c614-44c5-8add-1a131dfdd142" path="/var/lib/kubelet/pods/30914703-c614-44c5-8add-1a131dfdd142/volumes" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.876861 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.876913 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.878190 4693 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.878252 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs podName:c80a0f65-6193-435f-8138-eb5a4ba71b22 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:01.878234574 +0000 UTC m=+901.796319955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs") pod "openstack-operator-controller-manager-7cd5954d9-rqjq9" (UID: "c80a0f65-6193-435f-8138-eb5a4ba71b22") : secret "metrics-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.881206 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.881267 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs podName:c80a0f65-6193-435f-8138-eb5a4ba71b22 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:01.881250131 +0000 UTC m=+901.799335512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs") pod "openstack-operator-controller-manager-7cd5954d9-rqjq9" (UID: "c80a0f65-6193-435f-8138-eb5a4ba71b22") : secret "webhook-server-cert" not found Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.936538 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpv8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-tc9jb_openstack-operators(105791fd-407d-44a3-8fc8-af90e82b0f63): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.937335 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrdnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-66cf5c67ff-k2njb_openstack-operators(1c7db975-17d7-48dd-8e5a-0549749ab866): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.948314 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-864885998-tc9jb"] Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.960873 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb"] Nov 25 12:23:00 crc kubenswrapper[4693]: W1125 12:23:00.961355 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6bc1c64_200f_492f_bad9_dfecd5687698.slice/crio-a41e17ba83a84c69cf6c35edb162d07f20b925047a97f55171bb33def1967ccc WatchSource:0}: Error finding container a41e17ba83a84c69cf6c35edb162d07f20b925047a97f55171bb33def1967ccc: Status 404 returned error can't find the container with id a41e17ba83a84c69cf6c35edb162d07f20b925047a97f55171bb33def1967ccc Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.961619 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpv8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-tc9jb_openstack-operators(105791fd-407d-44a3-8fc8-af90e82b0f63): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.961745 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrdnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-66cf5c67ff-k2njb_openstack-operators(1c7db975-17d7-48dd-8e5a-0549749ab866): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.961890 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gb594,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-cwrvs_openstack-operators(b9227546-dcce-4b09-9311-19f844deb318): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.963132 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podUID="105791fd-407d-44a3-8fc8-af90e82b0f63" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.963218 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podUID="1c7db975-17d7-48dd-8e5a-0549749ab866" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.964498 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27"] Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.975597 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 
--upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gb594,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-cwrvs_openstack-operators(b9227546-dcce-4b09-9311-19f844deb318): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.977067 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.978353 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btr47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79556f57fc-flxdz_openstack-operators(7ecc8c23-d9b2-4d46-a8b0-76758035b267): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.992222 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp"] Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.992583 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w88f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-fd75fd47d-g972v_openstack-operators(fe2a0074-66dc-4730-9321-772ee8fd8e28): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.992777 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9mqmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-f4trp_openstack-operators(f6bc1c64-200f-492f-bad9-dfecd5687698): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.995975 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w88f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-fd75fd47d-g972v_openstack-operators(fe2a0074-66dc-4730-9321-772ee8fd8e28): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.996039 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9mqmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-f4trp_openstack-operators(f6bc1c64-200f-492f-bad9-dfecd5687698): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.997107 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" podUID="fe2a0074-66dc-4730-9321-772ee8fd8e28" Nov 25 12:23:00 crc kubenswrapper[4693]: E1125 12:23:00.997505 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podUID="f6bc1c64-200f-492f-bad9-dfecd5687698" Nov 25 12:23:00 crc kubenswrapper[4693]: I1125 12:23:00.998971 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs"] Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.006525 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz"] Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.014192 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zq25c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-qbjp2_openstack-operators(28782f20-4534-4137-b590-7a3b31c638b2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.015649 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podUID="28782f20-4534-4137-b590-7a3b31c638b2" Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.038429 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v"] Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.053561 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" 
event={"ID":"7cb65a4e-3294-4104-b3bf-6d1103b92c38","Type":"ContainerStarted","Data":"72ee8266c191c051fbf964f2b6099311286e4faf21ca9e6b9f68de056bef6b5e"} Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.054894 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" event={"ID":"2f11c884-15fc-4e2a-a533-d0eac0639f80","Type":"ContainerStarted","Data":"c2bb000e52ae3b3016a54d4aae21638162f1dd80536f2fa178254fca73f83a29"} Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.055767 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" event={"ID":"1c7db975-17d7-48dd-8e5a-0549749ab866","Type":"ContainerStarted","Data":"9c60eeb9c98e03a62ef3da9b4e369256e5102385a3fcfa78b4726fa66c8b503a"} Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.060271 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podUID="1c7db975-17d7-48dd-8e5a-0549749ab866" Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.061467 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" event={"ID":"28782f20-4534-4137-b590-7a3b31c638b2","Type":"ContainerStarted","Data":"f6a0f7aead80cb3ce09cabc2d737f3d8a35834286a407d7a0dd14f2024d0443c"} Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.062815 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podUID="28782f20-4534-4137-b590-7a3b31c638b2" Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.062955 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.062955 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjkfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-kmpm8_openstack-operators(ef0b302b-05d0-4be3-85ad-7eb3d70cec36): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.063976 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" event={"ID":"f6bc1c64-200f-492f-bad9-dfecd5687698","Type":"ContainerStarted","Data":"a41e17ba83a84c69cf6c35edb162d07f20b925047a97f55171bb33def1967ccc"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.064728 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2"]
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.065303 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjkfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-kmpm8_openstack-operators(ef0b302b-05d0-4be3-85ad-7eb3d70cec36): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.065426 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" event={"ID":"4ab70f55-282f-4509-bc36-71ef2fe4d35b","Type":"ContainerStarted","Data":"dc9f0801635d00933028c97a5f912aa8bbb3c42bc075ba976a862d0b53218c2a"}
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.065915 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podUID="f6bc1c64-200f-492f-bad9-dfecd5687698"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.066682 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" event={"ID":"b29c9c21-026a-4701-99a7-769d382a2da2","Type":"ContainerStarted","Data":"5222fe510dcaa50246b4fbadd5976d0afd00bfb94c945da27b44efa2355e0b74"}
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.066751 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" podUID="ef0b302b-05d0-4be3-85ad-7eb3d70cec36"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.068094 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" event={"ID":"105791fd-407d-44a3-8fc8-af90e82b0f63","Type":"ContainerStarted","Data":"030bab6abf57e858f31618bd314587547d51531a54a79790187c43cb5a82c8fb"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.070259 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" event={"ID":"0f35f544-581e-4cb2-900f-71213e27477d","Type":"ContainerStarted","Data":"cba4874e69f0453c382326b28b35e87fb257de5b31a156f6ec387a5464476c6f"}
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.071248 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podUID="105791fd-407d-44a3-8fc8-af90e82b0f63"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.072413 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" event={"ID":"3c29e8b9-57cf-4967-b5e2-a6af42c16099","Type":"ContainerStarted","Data":"8c8c6229d14370ac8d75bd741e0bf42f08006b7e5d47b995470df071cdec8e78"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.073818 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" event={"ID":"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196","Type":"ContainerStarted","Data":"ebacd0fe5c3a763fd0b458a094edb7d9beca0caf64b361939ca0ccdb9cd93005"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.075201 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" event={"ID":"9cc5c4a9-0119-48b6-a795-9f482b55278b","Type":"ContainerStarted","Data":"22ca659bf34ffb248224b156aa811659ef83312c03b8d51c450436f7c591ce67"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.077768 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" event={"ID":"a64b0f5c-e6af-4903-925a-028aec5477fd","Type":"ContainerStarted","Data":"a3af83ca60b184469eff3885f219d006326b0b51ef206f374003284f00c1755c"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.078831 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" event={"ID":"fe2a0074-66dc-4730-9321-772ee8fd8e28","Type":"ContainerStarted","Data":"5efa8b0cb2aac5989d08f969cffbb32665ebe0a23a2f6eb88419e877c3d83f97"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.080318 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" event={"ID":"4dd9cd53-1f66-4636-9fab-9f0b3ff38009","Type":"ContainerStarted","Data":"25f4ab352c7b480e52c7bc7831a985f8dea13662d220d11d052c84d0a6c5a4b8"}
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.080805 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" podUID="fe2a0074-66dc-4730-9321-772ee8fd8e28"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.081411 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" event={"ID":"22a83ecc-1f72-4474-a470-2ee4bef7eddf","Type":"ContainerStarted","Data":"6cba1b8c3f2513f1ac92d67695c4779cafa3c0239fdc2eee03e14393c173540b"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.082280 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" event={"ID":"7ecc8c23-d9b2-4d46-a8b0-76758035b267","Type":"ContainerStarted","Data":"0da8ec76a4747458c0f80fbe6fb37218876d8bb04966a0345cc494d102a2389c"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.083123 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" event={"ID":"5c98082e-070e-42b1-afdc-69cea132629e","Type":"ContainerStarted","Data":"43e8ee0389e148fd0b11c8219fcca799986eefd16eb36da8a00c3b695f90187a"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.084050 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" event={"ID":"bfeee7c1-207f-4862-b172-f2ffab4a1500","Type":"ContainerStarted","Data":"6a961f123a33ee120610305e807c2d3a7ea04bbe7f1e7071d2ffd4efd05999a7"}
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.085587 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" event={"ID":"b9227546-dcce-4b09-9311-19f844deb318","Type":"ContainerStarted","Data":"49d9fce83d0193c10c9ececac9aae253bb0fa18466c40a0e16cc32f0cbe2090d"}
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.087692 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.121546 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8"]
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.188170 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg"
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.189224 4693 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.189536 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert podName:a7c4eb9b-38af-41da-872e-b3da515b2f88 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:03.189496505 +0000 UTC m=+903.107581886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert") pod "openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" (UID: "a7c4eb9b-38af-41da-872e-b3da515b2f88") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.901186 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9"
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.901580 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9"
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.902366 4693 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Nov 25 12:23:01 crc kubenswrapper[4693]: E1125 12:23:01.902441 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs podName:c80a0f65-6193-435f-8138-eb5a4ba71b22 nodeName:}" failed. No retries permitted until 2025-11-25 12:23:03.902423922 +0000 UTC m=+903.820509303 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs") pod "openstack-operator-controller-manager-7cd5954d9-rqjq9" (UID: "c80a0f65-6193-435f-8138-eb5a4ba71b22") : secret "webhook-server-cert" not found
Nov 25 12:23:01 crc kubenswrapper[4693]: I1125 12:23:01.911808 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-metrics-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9"
Nov 25 12:23:02 crc kubenswrapper[4693]: I1125 12:23:02.098604 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" event={"ID":"ef0b302b-05d0-4be3-85ad-7eb3d70cec36","Type":"ContainerStarted","Data":"fca6796d94d91cc31fef08159fb9d8f8560b912fcf0fbb89fb7c93e4aa33d6fa"}
Nov 25 12:23:02 crc kubenswrapper[4693]: E1125 12:23:02.102642 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" podUID="ef0b302b-05d0-4be3-85ad-7eb3d70cec36"
Nov 25 12:23:02 crc kubenswrapper[4693]: E1125 12:23:02.103170 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podUID="f6bc1c64-200f-492f-bad9-dfecd5687698"
Nov 25 12:23:02 crc kubenswrapper[4693]: E1125 12:23:02.103604 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podUID="1c7db975-17d7-48dd-8e5a-0549749ab866"
Nov 25 12:23:02 crc kubenswrapper[4693]: E1125 12:23:02.103608 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" podUID="fe2a0074-66dc-4730-9321-772ee8fd8e28"
Nov 25 12:23:02 crc kubenswrapper[4693]: E1125 12:23:02.104169 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318"
Nov 25 12:23:02 crc kubenswrapper[4693]: E1125 12:23:02.104458 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podUID="28782f20-4534-4137-b590-7a3b31c638b2"
pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podUID="105791fd-407d-44a3-8fc8-af90e82b0f63" Nov 25 12:23:03 crc kubenswrapper[4693]: E1125 12:23:03.105802 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podUID="28782f20-4534-4137-b590-7a3b31c638b2" Nov 25 12:23:03 crc kubenswrapper[4693]: E1125 12:23:03.107278 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" podUID="ef0b302b-05d0-4be3-85ad-7eb3d70cec36" Nov 25 12:23:03 crc kubenswrapper[4693]: I1125 12:23:03.217575 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:23:03 crc kubenswrapper[4693]: I1125 12:23:03.226154 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7c4eb9b-38af-41da-872e-b3da515b2f88-cert\") pod \"openstack-baremetal-operator-controller-manager-b58f89467-jlbhg\" (UID: \"a7c4eb9b-38af-41da-872e-b3da515b2f88\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:23:03 crc kubenswrapper[4693]: I1125 12:23:03.390823 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-slsbp" Nov 25 12:23:03 crc kubenswrapper[4693]: I1125 12:23:03.399337 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:23:03 crc kubenswrapper[4693]: I1125 12:23:03.925993 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:03 crc kubenswrapper[4693]: I1125 12:23:03.929705 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c80a0f65-6193-435f-8138-eb5a4ba71b22-webhook-certs\") pod \"openstack-operator-controller-manager-7cd5954d9-rqjq9\" (UID: \"c80a0f65-6193-435f-8138-eb5a4ba71b22\") " pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:04 crc kubenswrapper[4693]: I1125 12:23:04.196240 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5mjsr" Nov 25 12:23:04 crc kubenswrapper[4693]: I1125 12:23:04.202110 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:18 crc kubenswrapper[4693]: E1125 12:23:18.801020 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:c6405d94e56b40ef669729216ab4b9c441f34bb280902efa2940038c076b560f" Nov 25 12:23:18 crc kubenswrapper[4693]: E1125 12:23:18.801705 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:c6405d94e56b40ef669729216ab4b9c441f34bb280902efa2940038c076b560f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9n87p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
Nov 25 12:23:18 crc kubenswrapper[4693]: E1125 12:23:18.801705 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:c6405d94e56b40ef669729216ab4b9c441f34bb280902efa2940038c076b560f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9n87p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-7d695c9b56-6dtx6_openstack-operators(9cc5c4a9-0119-48b6-a795-9f482b55278b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 12:23:19 crc kubenswrapper[4693]: E1125 12:23:19.224062 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a"
Nov 25 12:23:19 crc kubenswrapper[4693]: E1125 12:23:19.225025 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dq974,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-58bb8d67cc-5ghnq_openstack-operators(bfeee7c1-207f-4862-b172-f2ffab4a1500): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 12:23:20 crc kubenswrapper[4693]: E1125 12:23:20.054176 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0"
Nov 25 12:23:20 crc kubenswrapper[4693]: E1125 12:23:20.054382 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6kknr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6fdc4fcf86-bnf27_openstack-operators(c3a7c8cb-ac3c-43d3-b38d-0c3625c53196): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 12:23:21 crc kubenswrapper[4693]: E1125 12:23:21.286470 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a"
Nov 25 12:23:21 crc kubenswrapper[4693]: E1125 12:23:21.286945 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:3ef72bbd7cce89ff54d850ff44ca6d7b2360834a502da3d561aeb6fd3d9af50a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-khwc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-748dc6576f-zcpsz_openstack-operators(a64b0f5c-e6af-4903-925a-028aec5477fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 12:23:26 crc kubenswrapper[4693]: E1125 12:23:26.256745 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04"
Nov 25 12:23:26 crc kubenswrapper[4693]: E1125 12:23:26.257562 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tx5gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-cb6c4fdb7-s9shw_openstack-operators(22a83ecc-1f72-4474-a470-2ee4bef7eddf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 12:23:26 crc kubenswrapper[4693]: I1125 12:23:26.493648 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9"]
Nov 25 12:23:28 crc kubenswrapper[4693]: I1125 12:23:28.304279 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg"]
Nov 25 12:23:28 crc kubenswrapper[4693]: W1125 12:23:28.311333 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c4eb9b_38af_41da_872e_b3da515b2f88.slice/crio-2308b46e6e40542dc1cf61c5db582fe19dc64909f0aff4ddaf167bf4697ca4e9 WatchSource:0}: Error finding container 2308b46e6e40542dc1cf61c5db582fe19dc64909f0aff4ddaf167bf4697ca4e9: Status 404 returned error can't find the container with id 2308b46e6e40542dc1cf61c5db582fe19dc64909f0aff4ddaf167bf4697ca4e9
Nov 25 12:23:28 crc kubenswrapper[4693]: I1125 12:23:28.395487 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" event={"ID":"c80a0f65-6193-435f-8138-eb5a4ba71b22","Type":"ContainerStarted","Data":"d85e2750e40e8d1b5b5f6b18e226b29cd2d891c861cf1d4aaa0c8ba80abd1f7e"}
Nov 25 12:23:28 crc kubenswrapper[4693]: I1125 12:23:28.398356 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" event={"ID":"4ab70f55-282f-4509-bc36-71ef2fe4d35b","Type":"ContainerStarted","Data":"5b4de767d2aad21d15a4460a613fb046c8e38ea7b03c41a5db8cf688493820c2"}
Nov 25 12:23:28 crc kubenswrapper[4693]: I1125 12:23:28.399610 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" event={"ID":"a7c4eb9b-38af-41da-872e-b3da515b2f88","Type":"ContainerStarted","Data":"2308b46e6e40542dc1cf61c5db582fe19dc64909f0aff4ddaf167bf4697ca4e9"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.407215 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" event={"ID":"0f35f544-581e-4cb2-900f-71213e27477d","Type":"ContainerStarted","Data":"4ea6ff7d5e76fde10f591e8024510800752b4480ce89adb7f4b46a5d8acf1e11"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.409876 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" event={"ID":"1c7db975-17d7-48dd-8e5a-0549749ab866","Type":"ContainerStarted","Data":"4005ff4e6f0777123cda9b38521e0a98cf316b807919df05e588c2235579f4a9"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.411611 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" event={"ID":"b29c9c21-026a-4701-99a7-769d382a2da2","Type":"ContainerStarted","Data":"face5f24f67a4abbb0ec061572bb135258d901f43ebebe0ae7dfa2c8eee90cd7"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.413121 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" event={"ID":"2f11c884-15fc-4e2a-a533-d0eac0639f80","Type":"ContainerStarted","Data":"c133d166c5bdf16a1e28b10129f73de03ab48a6583ec62dd9c5aa2f8c72b92e6"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.414661 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" event={"ID":"3c29e8b9-57cf-4967-b5e2-a6af42c16099","Type":"ContainerStarted","Data":"526fd6415be6e0e16318eee3605a03a9f940d9cd41a789b9d17dab825cae7f64"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.416078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" event={"ID":"4dd9cd53-1f66-4636-9fab-9f0b3ff38009","Type":"ContainerStarted","Data":"a8e22ea9c6297db7676d5003b03a7b98815c1563cb3cd4b959529c23feb4b068"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.417571 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" event={"ID":"fe2a0074-66dc-4730-9321-772ee8fd8e28","Type":"ContainerStarted","Data":"4c72935e8f3b0f44c87f61604309800764180e90f764c2095fb8d658245ef2aa"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.419039 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" event={"ID":"5c98082e-070e-42b1-afdc-69cea132629e","Type":"ContainerStarted","Data":"96d6732128d018481c9c57f8412853a6a18b9b0a5b93dba81049f56bfa6bb6ad"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.420603 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" event={"ID":"c80a0f65-6193-435f-8138-eb5a4ba71b22","Type":"ContainerStarted","Data":"0939e18bfe497776320fc5084e6673c9353b0aa99a7d5728bb9bfff0d248e5cb"}
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.420781 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9"
Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.422679 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" event={"ID":"b9227546-dcce-4b09-9311-19f844deb318","Type":"ContainerStarted","Data":"d18ccdab540642b91a8050bb99fcb971d006c2fbaf87724cade89f3cfd6b5d2e"}
event={"ID":"7cb65a4e-3294-4104-b3bf-6d1103b92c38","Type":"ContainerStarted","Data":"04b9c09e42dc86f972d449310f260b434a79a4329b2f39c1dd3d390775a4a871"} Nov 25 12:23:29 crc kubenswrapper[4693]: I1125 12:23:29.488428 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" podStartSLOduration=30.488410023 podStartE2EDuration="30.488410023s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:23:29.486673304 +0000 UTC m=+929.404758685" watchObservedRunningTime="2025-11-25 12:23:29.488410023 +0000 UTC m=+929.406495404" Nov 25 12:23:30 crc kubenswrapper[4693]: E1125 12:23:30.056540 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c" Nov 25 12:23:30 crc kubenswrapper[4693]: E1125 12:23:30.056971 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9mqmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-5db546f9d9-f4trp_openstack-operators(f6bc1c64-200f-492f-bad9-dfecd5687698): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:23:32 crc kubenswrapper[4693]: E1125 12:23:32.205178 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d" Nov 25 12:23:32 crc kubenswrapper[4693]: E1125 12:23:32.206519 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjkfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-kmpm8_openstack-operators(ef0b302b-05d0-4be3-85ad-7eb3d70cec36): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:23:34 crc kubenswrapper[4693]: I1125 12:23:34.208455 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:23:47 crc kubenswrapper[4693]: E1125 12:23:47.963665 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" 
image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:23:47 crc kubenswrapper[4693]: E1125 12:23:47.964249 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dq974,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-58bb8d67cc-5ghnq_openstack-operators(bfeee7c1-207f-4862-b172-f2ffab4a1500): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:23:47 crc kubenswrapper[4693]: E1125 12:23:47.965557 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" podUID="bfeee7c1-207f-4862-b172-f2ffab4a1500" Nov 25 12:23:48 crc kubenswrapper[4693]: I1125 12:23:48.571896 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 12:23:49 crc kubenswrapper[4693]: E1125 12:23:49.759635 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Nov 25 12:23:49 crc kubenswrapper[4693]: E1125 12:23:49.760095 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zq25c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-qbjp2_openstack-operators(28782f20-4534-4137-b590-7a3b31c638b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:23:49 crc kubenswrapper[4693]: E1125 12:23:49.762189 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podUID="28782f20-4534-4137-b590-7a3b31c638b2" Nov 25 12:23:58 crc kubenswrapper[4693]: E1125 12:23:58.979215 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd" Nov 25 12:23:58 crc kubenswrapper[4693]: E1125 12:23:58.979928 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:7dbadf7b98f2f305f9f1382f55a084c8ca404f4263f76b28e56bd0dc437e2192,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:0473ff9eec0da231e2d0a10bf1abbe1dfa1a0f95b8f619e3a07605386951449a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:c8101c77a82eae4407e41e1fd766dfc6e1b7f9ed1679e3efb6f91ff97a1557b2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:eb9743b21bbadca6f7cb9ac4fc46b5d58c51c674073c7e1121f4474a71304071,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:3d81f839b98c2e2a5bf0da79f2f9a92dff7d0a3c5a830b0e95c89dad8cf98a6a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:d19ac99249b47dd8ea16cd6aaa5756346aa8a2f119ee50819c15c5366efb417d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:8536169e5537fe6c330eba814248abdcf39cdd8f7e7336034d74e6fda9544050,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:4f1fa337760e82bfd67cdd142a97c121146dd7e621daac161940dd5e4ddb80dc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:3613b345d5baed98effd906f8b0242d863e14c97078ea473ef01fe1b0afc46f3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:b73ad22b4955b06d584bce81742556d8c0c7828c495494f8ea7c99391c61b70f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:aa1d3aaf6b394621ed4089a98e0a82b763f467e8b5c5db772f9fdf99fc86e333,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s
-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:d6661053141b6df421288a7c9968a155ab82e478c1d75ab41f2cebe2f0ca02d2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:ce2d63258cb4e7d0d1c07234de6889c5434464190906798019311a1c7cf6387f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:0485ef9e5b4437f7cd2ba54034a87722ce4669ee86b3773c6b0c037ed8000e91,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api@sha256:43f8a00cd714c59f2c517fe6fabb63b16528191633eb39eef4002d49ace7ddb0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor@sha256:876a222b97b38b35012883c4146c8d102d019fcbe79f26d731d6f2e225e22ffc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:ff0c553ceeb2e0f44b010e37dc6d0db8a251797b88e56468b7cf7f05253e4232,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:624f553f073af7493d34828b074adc9981cce403edd8e71482c7307008479fd9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:e3874936a518c8560339db8f840fc5461885819f6050b5de8d3ab9199bea5094,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:1cea25f1d2a45affc80c46fb9d427749d3f06b61590ac6070a2910e3ec8a4e5d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:e36d5b9a65194f12f7b01c6422ba3ed52a687fd1695fbb21f4986c67d9f9317f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:8b21bec527d54cd766e277889df6bcccd2baeaa946274606b986c0c3b7ca689f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:45aceca77f8fcf61127f0da650bdfdf11ede9b0944c78b63fab819d03283f96b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:709ac58998927dd61786821ae1e63343fd97ccf5763aac5edb4583eea9401d22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:867d4ef7c21f75e6030a685b5762ab4d84b671316ed6b98d75200076e93342cd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL
_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:2b90da93550b99d2fcfa95bd819f3363aa68346a416f8dc7baac3e9c5f487761,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:8cde52cef8795d1c91983b100d86541c7718160ec260fe0f97b96add4c2c8ee8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:835ebed082fe1c45bd799d1d5357595ce63efeb05ca876f26b08443facb9c164,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:011d682241db724bc40736c9b54d2ea450ea7e6be095b1ff5fa28c8007466775,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:2025da90cff8f563deb08bee71efe16d4078edc2a767b2e225cca5c77f1aa2f9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:26bd7b0bd6070856aefef6fe754c547d55c056396ea30d879d34c2d49b5a1d29,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:ff46cd5e0e13d105c4629e78c2734a50835f06b6a1e31da9e0462981d10c4be3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:5b4fd0c2b76fa5539f74687b11c5882d77bd31352452322b37ff51fa18f12a61,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:5e03376bd895346dc8f627ca15ded942526ed8b5e92872f453ce272e694d18d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:36a0fb31978aee0ded2483de311631e64a644d0b0685b5b055f65ede7eb8e8a2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:5f6045841aff0fde6f684a34cdf49f8dc7b2c3bcbdeab201f1058971e0c5f79e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:448f4e1b740c30936e340bd6e8534d78c83357bf373a4223950aa64d3484f007,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE
_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:b68e3615af8a0eb0ef6bf9ceeef59540a6f4a9a85f6078a3620be115c73a7db8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:7eae01cf60383e523c9cd94d158a9162120a7370829a1dad20fdea6b0fd660bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:28cc10501788081eb61b5a1af35546191a92741f4f109df54c74e2b19439d0f9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:9a616e37acfd120612f78043237a8541266ba34883833c9beb43f3da313661ad,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:6b1be6cd94a0942259bca5d5d2c30cc7de4a33276b61f8ae3940226772106256,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:02d2c22d15401574941fbe057095442dee0d6f7a0a9341de35d25e6a12a3fe4b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:fc3b3a36b74fd653946723c54b208072d52200635850b531e9d595a7aaea5a01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:7850ccbff320bf9a1c9c769c1c70777eb97117dd8cd5ae4435be9b4622cf807a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:397dac7e39cf40d14a986e6ec4a60fb698ca35c197d0db315b1318514cc6d1d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:1c95142a36276686e720f86423ee171dc9adcc1e89879f627545b7c906ccd9bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:e331a8fde6638e5ba154c4f0b38772a9a424f60656f2777245975fb1fa02f07d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:cd3cf7a34053e850b4d4f9f4ea4c74953a54a42fd18e47d7c01d44a88923e925,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:aee28476344fc0cc148fbe97daf9b1bfcedc22001550bba4bdc4e84be7b6989d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:q
uay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:cfa0b92c976603ee2a937d34013a238fcd8aa75f998e50642e33489f14124633,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:73c2f2d6eecf88acf4e45b133c8373d9bb006b530e0aff0b28f3b7420620a874,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:927b405cc04abe5ff716186e8d35e2dc5fad1c8430194659ee6617d74e4e055d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:6154d7cebd7c339afa5b86330262156171743aa5b79c2b78f9a2f378005ed8fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:e2db2f4af8d3d0be7868c6efef0189f3a2c74a8f96ae10e3f991cdf83feaef29,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:c773629df257726a6d3cacc24a6e4df0babcd7d37df04e6d14676a8da028b9c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:776211111e2e6493706dbc49a3ba44f31d1b947919313ed3a0f35810e304ec52,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:ae4a20d9aad04cfaeaa3105fa8e37db4216c3b17530bc98daf1204555bc23485,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:7cccf24ad0a152f90ca39893064f48a1656950ee8142685a5d482c71f0bdc9f5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:05450b48f6b5352b2686a26e933e8727748edae2ae9652d9164b7d7a1817c55a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:fc9c99eeef91523482bd8f92661b393287e1f2a24ad2ba9e33191f8de9af74cf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:3e4ecc02b4b5e0860482a93599ba9ca598c5ce26c093c46e701f96fe51acb208,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:2346037e064861c7892690d2e8b3e1eea1a26ce3c3a11fda0b41301965bc828c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:7dd2e0dbb6bb5a6cecd1763e43479ca8cb6a0c502534e83c8795c0da2b50e099,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:95d67f51dfedd5bd3ec785b488425295b2d8c41feae3e6386ef471615381809b,ValueFrom:nil,},EnvVar{Name:RELATED
_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:c26c3ff9cabe3593ceb10006e782bf9391ac14785768ce9eec4f938c2d3cf228,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:273fe8c27d08d0f62773a02f8cef6a761a7768116ee1a4be611f93bbf63f2b75,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:daa45220bb1c47922d0917aa8fe423bb82b03a01429f1c9e37635e701e352d71,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:a80a074e227d3238bb6f285788a9e886ae7a5909ccbc5c19c93c369bdfe5b3b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:58ac66ca1be01fe0157977bd79a26cde4d0de153edfaf4162367c924826b2ef4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:5e3f93f3085cfd94e599bbf771635477e5e015b7c22c624edca926459d369e69,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:6edd7f91b0fc53dd91194f6e0c206a98e5667bb7a9c5f2a423349612d7300506,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:2590b6c6197091ca423dfb93a609e0d843b270ad642f0c1920ac23f79aec8dca,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4pklr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod openstack-baremetal-operator-controller-manager-b58f89467-jlbhg_openstack-operators(a7c4eb9b-38af-41da-872e-b3da515b2f88): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:24:02 crc kubenswrapper[4693]: E1125 12:24:02.814895 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podUID="28782f20-4534-4137-b590-7a3b31c638b2" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.130694 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.131231 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpv8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-864885998-tc9jb_openstack-operators(105791fd-407d-44a3-8fc8-af90e82b0f63): ErrImagePull: rpc error: 
code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.776005 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.775969 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.776366 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rxn6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-d5cc86f4b-r86ct_openstack-operators(5c98082e-070e-42b1-afdc-69cea132629e): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.776364 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrdnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-66cf5c67ff-k2njb_openstack-operators(1c7db975-17d7-48dd-8e5a-0549749ab866): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.778275 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podUID="1c7db975-17d7-48dd-8e5a-0549749ab866" Nov 25 12:24:04 crc kubenswrapper[4693]: E1125 12:24:04.778319 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" podUID="5c98082e-070e-42b1-afdc-69cea132629e" Nov 25 12:24:05 crc kubenswrapper[4693]: I1125 12:24:05.114023 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:24:05 crc kubenswrapper[4693]: I1125 12:24:05.114111 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:24:05 crc kubenswrapper[4693]: E1125 12:24:05.411225 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:05 crc kubenswrapper[4693]: E1125 12:24:05.411534 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tx5gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-cb6c4fdb7-s9shw_openstack-operators(22a83ecc-1f72-4474-a470-2ee4bef7eddf): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:05 crc kubenswrapper[4693]: E1125 12:24:05.412744 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" podUID="22a83ecc-1f72-4474-a470-2ee4bef7eddf" Nov 25 12:24:05 crc kubenswrapper[4693]: E1125 12:24:05.698381 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podUID="1c7db975-17d7-48dd-8e5a-0549749ab866" Nov 25 12:24:05 crc kubenswrapper[4693]: E1125 12:24:05.698829 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" podUID="5c98082e-070e-42b1-afdc-69cea132629e" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.195675 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.195963 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q8js7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-68b95954c9-866fd_openstack-operators(7cb65a4e-3294-4104-b3bf-6d1103b92c38): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.197302 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" podUID="7cb65a4e-3294-4104-b3bf-6d1103b92c38" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.213410 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.213566 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9mqmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5db546f9d9-f4trp_openstack-operators(f6bc1c64-200f-492f-bad9-dfecd5687698): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:06 crc kubenswrapper[4693]: 
E1125 12:24:06.214562 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.214734 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podUID="f6bc1c64-200f-492f-bad9-dfecd5687698" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.214835 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-smwnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-79856dc55c-4lt8v_openstack-operators(4ab70f55-282f-4509-bc36-71ef2fe4d35b): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.215941 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.216014 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" podUID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.216143 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hmf9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5bfcdc958c-szrv4_openstack-operators(3c29e8b9-57cf-4967-b5e2-a6af42c16099): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.217620 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podUID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.453717 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.453893 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjkfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cb74df96-kmpm8_openstack-operators(ef0b302b-05d0-4be3-85ad-7eb3d70cec36): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.453956 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.454613 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gb594,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-567f98c9d-cwrvs_openstack-operators(b9227546-dcce-4b09-9311-19f844deb318): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.455362 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" 
pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" podUID="ef0b302b-05d0-4be3-85ad-7eb3d70cec36" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.456548 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318" Nov 25 12:24:06 crc kubenswrapper[4693]: I1125 12:24:06.704220 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:24:06 crc kubenswrapper[4693]: I1125 12:24:06.704529 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:24:06 crc kubenswrapper[4693]: I1125 12:24:06.704975 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.705826 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.706230 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" podUID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.706551 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podUID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" Nov 25 12:24:06 crc kubenswrapper[4693]: E1125 12:24:06.706558 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" podUID="7cb65a4e-3294-4104-b3bf-6d1103b92c38" Nov 25 12:24:06 crc kubenswrapper[4693]: I1125 12:24:06.709291 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:24:06 crc kubenswrapper[4693]: I1125 12:24:06.710938 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:24:06 crc kubenswrapper[4693]: I1125 12:24:06.711899 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:24:07 crc kubenswrapper[4693]: E1125 12:24:07.185321 4693 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:07 crc kubenswrapper[4693]: E1125 12:24:07.185561 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrj5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c9694994-fwwsj_openstack-operators(4dd9cd53-1f66-4636-9fab-9f0b3ff38009): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:07 crc kubenswrapper[4693]: E1125 12:24:07.187027 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" podUID="4dd9cd53-1f66-4636-9fab-9f0b3ff38009" Nov 25 12:24:08 crc kubenswrapper[4693]: E1125 12:24:08.387528 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" podUID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" Nov 25 12:24:08 crc kubenswrapper[4693]: E1125 12:24:08.387541 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podUID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" Nov 25 12:24:08 crc kubenswrapper[4693]: E1125 12:24:08.387580 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" 
podUID="4dd9cd53-1f66-4636-9fab-9f0b3ff38009" Nov 25 12:24:08 crc kubenswrapper[4693]: E1125 12:24:08.387581 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.160705 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podUID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.161024 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.161099 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" podUID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.301469 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.305207 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.363609 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.363807 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pv9nf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-774b86978c-nzz29_openstack-operators(b29c9c21-026a-4701-99a7-769d382a2da2): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.365042 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" podUID="b29c9c21-026a-4701-99a7-769d382a2da2" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.453715 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.461403 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.514762 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.518159 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.586277 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.586430 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x4v48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-86dc4d89c8-6wxtj_openstack-operators(2f11c884-15fc-4e2a-a533-d0eac0639f80): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.587570 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" podUID="2f11c884-15fc-4e2a-a533-d0eac0639f80" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.730936 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.730989 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.731529 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.731523 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btr47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-79556f57fc-flxdz_openstack-operators(7ecc8c23-d9b2-4d46-a8b0-76758035b267): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.732891 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" podUID="7ecc8c23-d9b2-4d46-a8b0-76758035b267" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.736008 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.740002 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.844754 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:24:09 crc kubenswrapper[4693]: I1125 12:24:09.848125 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.933217 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" podUID="a7c4eb9b-38af-41da-872e-b3da515b2f88" Nov 25 12:24:09 crc kubenswrapper[4693]: E1125 12:24:09.933867 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" podUID="9cc5c4a9-0119-48b6-a795-9f482b55278b" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.365660 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.365830 4693 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w88f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-fd75fd47d-g972v_openstack-operators(fe2a0074-66dc-4730-9321-772ee8fd8e28): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.366995 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" podUID="fe2a0074-66dc-4730-9321-772ee8fd8e28" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.693950 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.694133 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6kknr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6fdc4fcf86-bnf27_openstack-operators(c3a7c8cb-ac3c-43d3-b38d-0c3625c53196): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.695987 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" podUID="c3a7c8cb-ac3c-43d3-b38d-0c3625c53196" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.738733 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" event={"ID":"2f11c884-15fc-4e2a-a533-d0eac0639f80","Type":"ContainerStarted","Data":"45eb808bf770fe646c5fc133b554a68d9575b22c9274239ed52284850ae8451b"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.741120 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" event={"ID":"4dd9cd53-1f66-4636-9fab-9f0b3ff38009","Type":"ContainerStarted","Data":"2fbeaf3f78e3e4c4e41d41cdbedeae4d5761fdf54c753a860d6880afb7cb362a"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.742847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" event={"ID":"1c7db975-17d7-48dd-8e5a-0549749ab866","Type":"ContainerStarted","Data":"600942e9c28ee791cd0975dd9d8df420aafead0d5c2d6eea86c89d7cad6e9056"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.744594 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" event={"ID":"5c98082e-070e-42b1-afdc-69cea132629e","Type":"ContainerStarted","Data":"ce6a84dc38e3fe498d65c9030c5ed634bf7b48c817d70f300551102749264f0e"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.745969 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" event={"ID":"bfeee7c1-207f-4862-b172-f2ffab4a1500","Type":"ContainerStarted","Data":"04fa4b8a1a141683ff4f52679e62bb1f0821120c98acdf0d250e3171e32b3d56"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.745998 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" event={"ID":"bfeee7c1-207f-4862-b172-f2ffab4a1500","Type":"ContainerStarted","Data":"606ac57bd54bda616ef63549e79ade51118f68fc2004439f20bd3928b67d782c"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.746167 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.747480 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" event={"ID":"b29c9c21-026a-4701-99a7-769d382a2da2","Type":"ContainerStarted","Data":"5decf82f2dbf2494be212f787aa35da3ad4eb7b68b84967e7ec4bdbb9e4e3259"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.749097 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" event={"ID":"7cb65a4e-3294-4104-b3bf-6d1103b92c38","Type":"ContainerStarted","Data":"4c4e7825bd423d88033b8260c307550d3c0c1d69d34608ecb93c90323adb615a"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.750475 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" event={"ID":"22a83ecc-1f72-4474-a470-2ee4bef7eddf","Type":"ContainerStarted","Data":"927e5034250e21dfca78b21626aa23b3350880744a619066584cc078dcd4494f"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.750498 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" event={"ID":"22a83ecc-1f72-4474-a470-2ee4bef7eddf","Type":"ContainerStarted","Data":"68c3e5c6880f7109205001bca2c88ce5139e3c4f07e97bd9efc93eb7b54c536b"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.750862 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.751794 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" event={"ID":"a7c4eb9b-38af-41da-872e-b3da515b2f88","Type":"ContainerStarted","Data":"4f9ab1305ff19e6acb4fe7e2a5df97d8ec462917bf8ff151c5806fda7a91db0a"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.754168 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" event={"ID":"9cc5c4a9-0119-48b6-a795-9f482b55278b","Type":"ContainerStarted","Data":"167425cb2414a9dc06df679237d988d5fa2777c8b5fb71e5910306a52464d6a6"} Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.754901 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.760094 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.760456 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sglkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7c57c8bbc4-csrpt_openstack-operators(0f35f544-581e-4cb2-900f-71213e27477d): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.760515 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:24:10 crc kubenswrapper[4693]: E1125 12:24:10.762074 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" podUID="0f35f544-581e-4cb2-900f-71213e27477d" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.765288 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" podStartSLOduration=53.537172756 podStartE2EDuration="1m12.765274228s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.434852753 +0000 UTC m=+900.352938134" lastFinishedPulling="2025-11-25 12:23:19.662954235 +0000 UTC m=+919.581039606" observedRunningTime="2025-11-25 12:24:10.762384816 +0000 UTC m=+970.680470197" watchObservedRunningTime="2025-11-25 12:24:10.765274228 +0000 UTC m=+970.683359609" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.794339 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" podStartSLOduration=51.392839954 podStartE2EDuration="1m12.794316826s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.634358668 +0000 UTC m=+900.552444049" lastFinishedPulling="2025-11-25 12:23:22.03583553 +0000 UTC m=+921.953920921" observedRunningTime="2025-11-25 12:24:10.788168593 +0000 UTC m=+970.706253974" watchObservedRunningTime="2025-11-25 12:24:10.794316826 +0000 UTC m=+970.712402207" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.815243 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" podStartSLOduration=48.268573114 podStartE2EDuration="1m12.815167994s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.434595126 +0000 UTC m=+900.352680507" lastFinishedPulling="2025-11-25 12:23:24.981190006 +0000 UTC m=+924.899275387" observedRunningTime="2025-11-25 12:24:10.8058117 +0000 UTC m=+970.723897081" watchObservedRunningTime="2025-11-25 12:24:10.815167994 +0000 UTC m=+970.733253375" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.874895 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podStartSLOduration=44.452948981 podStartE2EDuration="1m11.874878098s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.937161519 +0000 UTC m=+900.855246900" lastFinishedPulling="2025-11-25 12:23:28.359090636 +0000 UTC m=+928.277176017" observedRunningTime="2025-11-25 12:24:10.874311292 +0000 UTC m=+970.792396673" watchObservedRunningTime="2025-11-25 12:24:10.874878098 +0000 UTC m=+970.792963479" Nov 25 12:24:10 crc kubenswrapper[4693]: I1125 12:24:10.974314 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" podStartSLOduration=2.758237252 podStartE2EDuration="1m11.974295611s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.723588933 +0000 UTC m=+900.641674314" lastFinishedPulling="2025-11-25 12:24:09.939647292 +0000 UTC m=+969.857732673" observedRunningTime="2025-11-25 12:24:10.971191594 +0000 UTC m=+970.889276975" watchObservedRunningTime="2025-11-25 12:24:10.974295611 +0000 UTC m=+970.892380992" Nov 25 12:24:11 crc kubenswrapper[4693]: I1125 12:24:11.015127 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" podStartSLOduration=51.412856517 podStartE2EDuration="1m13.015106561s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.43335696 +0000 UTC m=+900.351442341" lastFinishedPulling="2025-11-25 12:23:22.035607004 +0000 UTC m=+921.953692385" observedRunningTime="2025-11-25 12:24:11.014914236 +0000 UTC m=+970.932999617" watchObservedRunningTime="2025-11-25 12:24:11.015106561 +0000 UTC m=+970.933191942" Nov 25 12:24:11 crc kubenswrapper[4693]: I1125 12:24:11.019976 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" podStartSLOduration=53.742117092 podStartE2EDuration="1m13.019954627s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.385435479 +0000 UTC m=+900.303520860" lastFinishedPulling="2025-11-25 12:23:19.663273024 +0000 UTC m=+919.581358395" observedRunningTime="2025-11-25 12:24:10.999837701 +0000 UTC m=+970.917923102" watchObservedRunningTime="2025-11-25 12:24:11.019954627 +0000 UTC m=+970.938040018" Nov 25 12:24:11 crc kubenswrapper[4693]: I1125 12:24:11.049027 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" podStartSLOduration=3.219175814 podStartE2EDuration="1m12.049010287s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.619802655 +0000 UTC 
m=+900.537888036" lastFinishedPulling="2025-11-25 12:24:09.449637128 +0000 UTC m=+969.367722509" observedRunningTime="2025-11-25 12:24:11.043280465 +0000 UTC m=+970.961365846" watchObservedRunningTime="2025-11-25 12:24:11.049010287 +0000 UTC m=+970.967095668" Nov 25 12:24:11 crc kubenswrapper[4693]: I1125 12:24:11.764271 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" event={"ID":"fe2a0074-66dc-4730-9321-772ee8fd8e28","Type":"ContainerStarted","Data":"940f0cd07257f64aa872ed6eaa401216f9e62e62b8c7df5317f9e23e65b97d06"} Nov 25 12:24:11 crc kubenswrapper[4693]: I1125 12:24:11.818094 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" podStartSLOduration=45.483033222 podStartE2EDuration="1m12.818071778s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.992467249 +0000 UTC m=+900.910552630" lastFinishedPulling="2025-11-25 12:23:28.327505805 +0000 UTC m=+928.245591186" observedRunningTime="2025-11-25 12:24:11.800164954 +0000 UTC m=+971.718250415" watchObservedRunningTime="2025-11-25 12:24:11.818071778 +0000 UTC m=+971.736157169" Nov 25 12:24:11 crc kubenswrapper[4693]: E1125 12:24:11.881385 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Nov 25 12:24:11 crc kubenswrapper[4693]: E1125 12:24:11.881535 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-khwc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-748dc6576f-zcpsz_openstack-operators(a64b0f5c-e6af-4903-925a-028aec5477fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:24:11 crc kubenswrapper[4693]: E1125 12:24:11.882791 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", 
failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" podUID="a64b0f5c-e6af-4903-925a-028aec5477fd" Nov 25 12:24:12 crc kubenswrapper[4693]: E1125 12:24:12.096665 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podUID="105791fd-407d-44a3-8fc8-af90e82b0f63" Nov 25 12:24:12 crc kubenswrapper[4693]: I1125 12:24:12.771263 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" event={"ID":"0f35f544-581e-4cb2-900f-71213e27477d","Type":"ContainerStarted","Data":"9854c4b6b2c523a64edf9971a62151d2e2239d98a89ce6177c7c99ddd699af5f"} Nov 25 12:24:12 crc kubenswrapper[4693]: I1125 12:24:12.772413 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:24:12 crc kubenswrapper[4693]: I1125 12:24:12.772511 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" event={"ID":"105791fd-407d-44a3-8fc8-af90e82b0f63","Type":"ContainerStarted","Data":"4d3e39b51b9f5fede5c0f0a6435ff6670dba1d7bbd86c591227bede7c64ee7da"} Nov 25 12:24:12 crc kubenswrapper[4693]: I1125 12:24:12.773913 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:24:12 crc kubenswrapper[4693]: E1125 12:24:12.774662 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podUID="105791fd-407d-44a3-8fc8-af90e82b0f63" Nov 25 12:24:12 crc kubenswrapper[4693]: I1125 12:24:12.795065 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" podStartSLOduration=53.189613121 podStartE2EDuration="1m13.795045841s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.634195214 +0000 UTC m=+900.552280595" lastFinishedPulling="2025-11-25 12:23:21.239627914 +0000 UTC m=+921.157713315" observedRunningTime="2025-11-25 12:24:12.789244468 +0000 UTC m=+972.707329849" watchObservedRunningTime="2025-11-25 12:24:12.795045841 +0000 UTC m=+972.713131222" Nov 25 12:24:13 crc kubenswrapper[4693]: I1125 12:24:13.780699 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" event={"ID":"a7c4eb9b-38af-41da-872e-b3da515b2f88","Type":"ContainerStarted","Data":"81f51a4757df61cde05d7103c5afa05ab33ddd62b2f9070c92b333b26afb71d9"} Nov 25 12:24:13 crc kubenswrapper[4693]: I1125 12:24:13.780996 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:24:13 crc kubenswrapper[4693]: I1125 
12:24:13.812949 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" podStartSLOduration=30.660988167 podStartE2EDuration="1m14.812933436s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:28.339222535 +0000 UTC m=+928.257307926" lastFinishedPulling="2025-11-25 12:24:12.491167814 +0000 UTC m=+972.409253195" observedRunningTime="2025-11-25 12:24:13.807591025 +0000 UTC m=+973.725676416" watchObservedRunningTime="2025-11-25 12:24:13.812933436 +0000 UTC m=+973.731018817" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.805167 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" event={"ID":"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196","Type":"ContainerStarted","Data":"31cce893a236b783f6525d5600f5e036c113846dd426c8fe16d5d37ab694bfe6"} Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.805751 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" event={"ID":"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196","Type":"ContainerStarted","Data":"5cc961f810e4c360ff7afa41b8fd4044ac91c3a4062a28a1ef8eb78c708dd041"} Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.805937 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.806698 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" event={"ID":"9cc5c4a9-0119-48b6-a795-9f482b55278b","Type":"ContainerStarted","Data":"98f562f76f85fc439569aec3906d2175d30c8be267cc242d39a6d77c0ea98b82"} Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.807134 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.808924 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" event={"ID":"a64b0f5c-e6af-4903-925a-028aec5477fd","Type":"ContainerStarted","Data":"bce272b2911ea059916e4bede1bf5a458552c3ecb989d0f6e3311e11273a8a29"} Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.808972 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" event={"ID":"a64b0f5c-e6af-4903-925a-028aec5477fd","Type":"ContainerStarted","Data":"eabf73ef1d8ddebd6d00e1e3b35c56f3423f2a0ec131aeeb95d9da26f0729f3e"} Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.809174 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.811099 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" event={"ID":"7ecc8c23-d9b2-4d46-a8b0-76758035b267","Type":"ContainerStarted","Data":"23d9a6cea03b37a3e62a214adf6f0dd5830d1eba55c3049a688e20938198b594"} Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.811146 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" 
event={"ID":"7ecc8c23-d9b2-4d46-a8b0-76758035b267","Type":"ContainerStarted","Data":"ddb9b505fe4a4467877b656fa5b5cdbbc8fd54bcf2d5d3a6ae2c4bd1f705d3a8"} Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.812050 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.834164 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" podStartSLOduration=3.3224124489999998 podStartE2EDuration="1m16.834143178s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.924083017 +0000 UTC m=+900.842168388" lastFinishedPulling="2025-11-25 12:24:14.435813736 +0000 UTC m=+974.353899117" observedRunningTime="2025-11-25 12:24:15.827708426 +0000 UTC m=+975.745793847" watchObservedRunningTime="2025-11-25 12:24:15.834143178 +0000 UTC m=+975.752228559" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.855546 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" podStartSLOduration=3.8116684149999998 podStartE2EDuration="1m17.85551814s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.764780272 +0000 UTC m=+900.682865653" lastFinishedPulling="2025-11-25 12:24:14.808629997 +0000 UTC m=+974.726715378" observedRunningTime="2025-11-25 12:24:15.848496352 +0000 UTC m=+975.766581743" watchObservedRunningTime="2025-11-25 12:24:15.85551814 +0000 UTC m=+975.773603541" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.878973 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" podStartSLOduration=3.470524533 podStartE2EDuration="1m17.878955231s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.400920719 +0000 UTC m=+900.319006100" lastFinishedPulling="2025-11-25 12:24:14.809351417 +0000 UTC m=+974.727436798" observedRunningTime="2025-11-25 12:24:15.876154632 +0000 UTC m=+975.794240013" watchObservedRunningTime="2025-11-25 12:24:15.878955231 +0000 UTC m=+975.797040612" Nov 25 12:24:15 crc kubenswrapper[4693]: I1125 12:24:15.921682 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" podStartSLOduration=3.076463165 podStartE2EDuration="1m16.921665975s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.978209064 +0000 UTC m=+900.896294455" lastFinishedPulling="2025-11-25 12:24:14.823411884 +0000 UTC m=+974.741497265" observedRunningTime="2025-11-25 12:24:15.900956141 +0000 UTC m=+975.819041542" watchObservedRunningTime="2025-11-25 12:24:15.921665975 +0000 UTC m=+975.839751356" Nov 25 12:24:19 crc kubenswrapper[4693]: I1125 12:24:19.617997 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:24:19 crc kubenswrapper[4693]: I1125 12:24:19.680662 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:24:19 crc kubenswrapper[4693]: I1125 12:24:19.855623 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" event={"ID":"28782f20-4534-4137-b590-7a3b31c638b2","Type":"ContainerStarted","Data":"7c126151898ffab73ef187f632f68d5495628bc12473d8a45d20ebf6aa04f3be"} Nov 25 12:24:19 crc kubenswrapper[4693]: I1125 12:24:19.892593 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podStartSLOduration=2.8363584939999997 podStartE2EDuration="1m20.892570442s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:01.014086093 +0000 UTC m=+900.932171474" lastFinishedPulling="2025-11-25 12:24:19.070298041 +0000 UTC m=+978.988383422" observedRunningTime="2025-11-25 12:24:19.883917538 +0000 UTC m=+979.802002919" watchObservedRunningTime="2025-11-25 12:24:19.892570442 +0000 UTC m=+979.810655823" Nov 25 12:24:20 crc kubenswrapper[4693]: I1125 12:24:20.863845 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" event={"ID":"b9227546-dcce-4b09-9311-19f844deb318","Type":"ContainerStarted","Data":"e5cea05e36425b591e72280fa35fa3487641dfa734cd90d11cbb2d4d1049ae36"} Nov 25 12:24:20 crc kubenswrapper[4693]: I1125 12:24:20.866037 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" event={"ID":"4ab70f55-282f-4509-bc36-71ef2fe4d35b","Type":"ContainerStarted","Data":"5190da2420808f4beac5565c3027e5ebaf9dfb9ad895a13b2235f0a3153ce379"} Nov 25 12:24:20 crc kubenswrapper[4693]: I1125 12:24:20.890731 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podStartSLOduration=54.488151416 podStartE2EDuration="1m21.890710291s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.961784848 +0000 UTC m=+900.879870229" lastFinishedPulling="2025-11-25 12:23:28.364343723 +0000 UTC m=+928.282429104" observedRunningTime="2025-11-25 12:24:20.887308066 +0000 UTC m=+980.805393457" watchObservedRunningTime="2025-11-25 12:24:20.890710291 +0000 UTC m=+980.808795662" Nov 25 12:24:20 crc kubenswrapper[4693]: I1125 12:24:20.912387 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" podStartSLOduration=71.073253252 podStartE2EDuration="1m22.912339152s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.06363921 +0000 UTC m=+899.981724591" lastFinishedPulling="2025-11-25 12:23:11.9027251 +0000 UTC m=+911.820810491" observedRunningTime="2025-11-25 12:24:20.906932349 +0000 UTC m=+980.825017730" watchObservedRunningTime="2025-11-25 12:24:20.912339152 +0000 UTC m=+980.830424533" Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.406257 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.893104 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" event={"ID":"3c29e8b9-57cf-4967-b5e2-a6af42c16099","Type":"ContainerStarted","Data":"1b475a4526266291b74b89bdbcbf88bca6546caa3fb51694b2e835d3f42a4414"} Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.898144 4693 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" event={"ID":"f6bc1c64-200f-492f-bad9-dfecd5687698","Type":"ContainerStarted","Data":"65da554e203caed58360190a8c8e12eb57e651e2ab91254798d01def109fb99d"} Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.898189 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" event={"ID":"f6bc1c64-200f-492f-bad9-dfecd5687698","Type":"ContainerStarted","Data":"81ab856e9d9d58631859ef51fb4795f49c2b4aa5144303ea9f8681d5edf9bd3e"} Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.898503 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.900678 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" event={"ID":"ef0b302b-05d0-4be3-85ad-7eb3d70cec36","Type":"ContainerStarted","Data":"2854ce7c169a87fefef1358e5b73cdf2e0c0c61740049fe380978b40b670927e"} Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.900710 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" event={"ID":"ef0b302b-05d0-4be3-85ad-7eb3d70cec36","Type":"ContainerStarted","Data":"f4bff34b34e94d1610d7a484673bcf1170a3be8c9a7208ea7be1f78b05452b45"} Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.901163 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.913311 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podStartSLOduration=64.311320778 podStartE2EDuration="1m25.913294464s" podCreationTimestamp="2025-11-25 12:22:58 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.433673269 +0000 UTC m=+900.351758650" lastFinishedPulling="2025-11-25 12:23:22.035646955 +0000 UTC m=+921.953732336" observedRunningTime="2025-11-25 12:24:23.910899296 +0000 UTC m=+983.828984677" watchObservedRunningTime="2025-11-25 12:24:23.913294464 +0000 UTC m=+983.831379845" Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.938754 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" podStartSLOduration=3.991029047 podStartE2EDuration="1m24.938737951s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:01.062781156 +0000 UTC m=+900.980866527" lastFinishedPulling="2025-11-25 12:24:22.01049001 +0000 UTC m=+981.928575431" observedRunningTime="2025-11-25 12:24:23.937234419 +0000 UTC m=+983.855319810" watchObservedRunningTime="2025-11-25 12:24:23.938737951 +0000 UTC m=+983.856823342" Nov 25 12:24:23 crc kubenswrapper[4693]: I1125 12:24:23.968023 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podStartSLOduration=3.952255439 podStartE2EDuration="1m24.968006316s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.992668215 +0000 UTC m=+900.910753596" lastFinishedPulling="2025-11-25 12:24:22.008419092 +0000 UTC m=+981.926504473" 
observedRunningTime="2025-11-25 12:24:23.961848413 +0000 UTC m=+983.879933804" watchObservedRunningTime="2025-11-25 12:24:23.968006316 +0000 UTC m=+983.886091707" Nov 25 12:24:28 crc kubenswrapper[4693]: I1125 12:24:28.941295 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" event={"ID":"105791fd-407d-44a3-8fc8-af90e82b0f63","Type":"ContainerStarted","Data":"12d86e8aa72882509063f1cd74a6d24c85786c9a11199879e8818192e9d52fff"} Nov 25 12:24:28 crc kubenswrapper[4693]: I1125 12:24:28.942113 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:24:28 crc kubenswrapper[4693]: I1125 12:24:28.969874 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podStartSLOduration=3.034543075 podStartE2EDuration="1m29.969854017s" podCreationTimestamp="2025-11-25 12:22:59 +0000 UTC" firstStartedPulling="2025-11-25 12:23:00.936403937 +0000 UTC m=+900.854489318" lastFinishedPulling="2025-11-25 12:24:27.871714839 +0000 UTC m=+987.789800260" observedRunningTime="2025-11-25 12:24:28.965514875 +0000 UTC m=+988.883600266" watchObservedRunningTime="2025-11-25 12:24:28.969854017 +0000 UTC m=+988.887939408" Nov 25 12:24:29 crc kubenswrapper[4693]: I1125 12:24:29.199259 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:24:29 crc kubenswrapper[4693]: I1125 12:24:29.586679 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:24:29 crc kubenswrapper[4693]: I1125 12:24:29.766474 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:24:29 crc kubenswrapper[4693]: I1125 12:24:29.900033 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:24:29 crc kubenswrapper[4693]: I1125 12:24:29.934664 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:24:30 crc kubenswrapper[4693]: I1125 12:24:30.268214 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:24:35 crc kubenswrapper[4693]: I1125 12:24:35.113819 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:24:35 crc kubenswrapper[4693]: I1125 12:24:35.114179 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:24:40 crc kubenswrapper[4693]: I1125 12:24:40.568248 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.104734 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-7hxl6"] Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.106407 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.112257 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.112495 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.112786 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-828mx" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.112888 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.126029 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-7hxl6"] Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.214079 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6584b49599-xbtds"] Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.215383 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.217820 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.221981 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-xbtds"] Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.288187 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfqc\" (UniqueName: \"kubernetes.io/projected/4c623cf1-c8d2-4e9d-8923-74e3825296f3-kube-api-access-9tfqc\") pod \"dnsmasq-dns-7bdd77c89-7hxl6\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.288330 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c623cf1-c8d2-4e9d-8923-74e3825296f3-config\") pod \"dnsmasq-dns-7bdd77c89-7hxl6\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.389528 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c623cf1-c8d2-4e9d-8923-74e3825296f3-config\") pod \"dnsmasq-dns-7bdd77c89-7hxl6\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.389596 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-dns-svc\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.389623 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-config\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.389666 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmpkq\" (UniqueName: \"kubernetes.io/projected/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-kube-api-access-tmpkq\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.389701 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfqc\" (UniqueName: \"kubernetes.io/projected/4c623cf1-c8d2-4e9d-8923-74e3825296f3-kube-api-access-9tfqc\") pod \"dnsmasq-dns-7bdd77c89-7hxl6\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.390492 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c623cf1-c8d2-4e9d-8923-74e3825296f3-config\") pod \"dnsmasq-dns-7bdd77c89-7hxl6\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.412727 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfqc\" (UniqueName: \"kubernetes.io/projected/4c623cf1-c8d2-4e9d-8923-74e3825296f3-kube-api-access-9tfqc\") pod \"dnsmasq-dns-7bdd77c89-7hxl6\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.423837 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.491215 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmpkq\" (UniqueName: \"kubernetes.io/projected/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-kube-api-access-tmpkq\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.491306 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-dns-svc\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.491335 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-config\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.492248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-config\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.493296 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-dns-svc\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.513792 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmpkq\" (UniqueName: \"kubernetes.io/projected/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-kube-api-access-tmpkq\") pod \"dnsmasq-dns-6584b49599-xbtds\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.530793 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.889024 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-7hxl6"] Nov 25 12:24:54 crc kubenswrapper[4693]: I1125 12:24:54.965099 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-xbtds"] Nov 25 12:24:54 crc kubenswrapper[4693]: W1125 12:24:54.966656 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a7136b_8bdc_46a5_b26a_9bbe82cafded.slice/crio-96b99dbe417744b2fbd6af2688476f7857bda5d90b6451efb0647c6302c396c8 WatchSource:0}: Error finding container 96b99dbe417744b2fbd6af2688476f7857bda5d90b6451efb0647c6302c396c8: Status 404 returned error can't find the container with id 96b99dbe417744b2fbd6af2688476f7857bda5d90b6451efb0647c6302c396c8 Nov 25 12:24:55 crc kubenswrapper[4693]: I1125 12:24:55.117146 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-xbtds" event={"ID":"d1a7136b-8bdc-46a5-b26a-9bbe82cafded","Type":"ContainerStarted","Data":"96b99dbe417744b2fbd6af2688476f7857bda5d90b6451efb0647c6302c396c8"} Nov 25 12:24:55 crc kubenswrapper[4693]: I1125 12:24:55.118472 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" event={"ID":"4c623cf1-c8d2-4e9d-8923-74e3825296f3","Type":"ContainerStarted","Data":"1f9249d852e4581cff97fe2ae88cb59ed8973ebf45ffdd4257b83e8ff1ce8ebb"} Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.290998 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-xbtds"] Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.320567 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-qg7p6"] Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.322078 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.337496 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-qg7p6"] Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.436423 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttr5x\" (UniqueName: \"kubernetes.io/projected/281ee04c-45b9-4b52-ace7-e5de8bcb2314-kube-api-access-ttr5x\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.436477 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-config\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.436519 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.538998 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttr5x\" (UniqueName: \"kubernetes.io/projected/281ee04c-45b9-4b52-ace7-e5de8bcb2314-kube-api-access-ttr5x\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.539312 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-config\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.539347 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.540403 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-dns-svc\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.543595 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-config\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.608337 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttr5x\" (UniqueName: 
\"kubernetes.io/projected/281ee04c-45b9-4b52-ace7-e5de8bcb2314-kube-api-access-ttr5x\") pod \"dnsmasq-dns-7c6d9948dc-qg7p6\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.660777 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-7hxl6"] Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.671406 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.698204 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4lr9b"] Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.701220 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.716809 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4lr9b"] Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.846122 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-dns-svc\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.846189 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-config\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.846301 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqcjt\" (UniqueName: \"kubernetes.io/projected/8f86ef62-ce38-42c9-a0bf-667726b714cc-kube-api-access-kqcjt\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.947261 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-config\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.947577 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqcjt\" (UniqueName: \"kubernetes.io/projected/8f86ef62-ce38-42c9-a0bf-667726b714cc-kube-api-access-kqcjt\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.947617 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-dns-svc\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.948640 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-dns-svc\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.949901 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-config\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:57 crc kubenswrapper[4693]: I1125 12:24:57.980609 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqcjt\" (UniqueName: \"kubernetes.io/projected/8f86ef62-ce38-42c9-a0bf-667726b714cc-kube-api-access-kqcjt\") pod \"dnsmasq-dns-6486446b9f-4lr9b\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.100707 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.246253 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-qg7p6"] Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.483065 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.488563 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.491209 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.492081 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-drzcq" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.492264 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.492486 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.492676 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.492691 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.493804 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.497224 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.650581 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4lr9b"] Nov 25 12:24:58 crc kubenswrapper[4693]: W1125 12:24:58.657726 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f86ef62_ce38_42c9_a0bf_667726b714cc.slice/crio-5536230188e274d5700963fc534400241e22f91bbf706fe30591055e9637aa28 WatchSource:0}: Error finding container 
5536230188e274d5700963fc534400241e22f91bbf706fe30591055e9637aa28: Status 404 returned error can't find the container with id 5536230188e274d5700963fc534400241e22f91bbf706fe30591055e9637aa28 Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658534 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dcb107e2-5742-4030-a7fc-a8eb016f449b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658580 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658612 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658695 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658717 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658751 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658883 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658914 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658962 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dcb107e2-5742-4030-a7fc-a8eb016f449b-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.658984 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9mj\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-kube-api-access-bs9mj\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.659031 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760695 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dcb107e2-5742-4030-a7fc-a8eb016f449b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760751 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9mj\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-kube-api-access-bs9mj\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760787 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760810 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dcb107e2-5742-4030-a7fc-a8eb016f449b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760837 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760864 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760902 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760923 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760955 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.760990 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.761012 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.763155 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.763503 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.763917 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.764062 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.764273 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.767332 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " 
pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.773148 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.776350 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dcb107e2-5742-4030-a7fc-a8eb016f449b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.781202 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dcb107e2-5742-4030-a7fc-a8eb016f449b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.781207 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.784236 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9mj\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-kube-api-access-bs9mj\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.787665 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.814855 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.832064 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.833844 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.833956 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.837522 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f892l" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.837550 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.837558 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.837607 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.837565 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.837726 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.838757 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964533 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vpzc\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-kube-api-access-2vpzc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964593 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd38986-be2a-4adf-b594-352740498acd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964633 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd38986-be2a-4adf-b594-352740498acd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964791 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964828 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964866 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964893 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964931 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.964959 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.965002 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:58 crc kubenswrapper[4693]: I1125 12:24:58.965099 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066563 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066629 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066660 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066693 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066742 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066781 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vpzc\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-kube-api-access-2vpzc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066816 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd38986-be2a-4adf-b594-352740498acd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066851 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd38986-be2a-4adf-b594-352740498acd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066880 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066913 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.066956 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.067210 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.067463 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.068216 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.069095 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.069181 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.069944 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.073091 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd38986-be2a-4adf-b594-352740498acd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.074769 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd38986-be2a-4adf-b594-352740498acd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.085527 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.085613 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.096918 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vpzc\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-kube-api-access-2vpzc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.103001 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.186977 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" event={"ID":"8f86ef62-ce38-42c9-a0bf-667726b714cc","Type":"ContainerStarted","Data":"5536230188e274d5700963fc534400241e22f91bbf706fe30591055e9637aa28"} Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.189120 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" event={"ID":"281ee04c-45b9-4b52-ace7-e5de8bcb2314","Type":"ContainerStarted","Data":"ca8c9f2f7edc12aa519cd452fd30ab2ed2c0f4019a48a97bb8283d017f55534a"} Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.194896 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.374716 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:24:59 crc kubenswrapper[4693]: W1125 12:24:59.393897 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcb107e2_5742_4030_a7fc_a8eb016f449b.slice/crio-1337c98d3719b6e7375b663fba3337f9dd6a61b07d8a63dd7ee90fcea7aa96ad WatchSource:0}: Error finding container 1337c98d3719b6e7375b663fba3337f9dd6a61b07d8a63dd7ee90fcea7aa96ad: Status 404 returned error can't find the container with id 1337c98d3719b6e7375b663fba3337f9dd6a61b07d8a63dd7ee90fcea7aa96ad Nov 25 12:24:59 crc kubenswrapper[4693]: I1125 12:24:59.711573 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.221238 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cd38986-be2a-4adf-b594-352740498acd","Type":"ContainerStarted","Data":"2e200845d976e19a3f2eba053630629d292c18d961b86603a944d2d04ec52c74"} Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.225328 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dcb107e2-5742-4030-a7fc-a8eb016f449b","Type":"ContainerStarted","Data":"1337c98d3719b6e7375b663fba3337f9dd6a61b07d8a63dd7ee90fcea7aa96ad"} Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.255995 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.257182 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.259261 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.259642 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.260790 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.260817 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lfxmx" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.265726 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.265913 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.393551 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.393869 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-config-data-default\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.393889 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.393906 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.394104 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmlc2\" (UniqueName: \"kubernetes.io/projected/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-kube-api-access-kmlc2\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.394189 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.394229 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-kolla-config\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.394365 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.495754 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.495812 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.495835 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-config-data-default\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.496282 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.496633 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.497352 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.497446 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.497553 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmlc2\" (UniqueName: \"kubernetes.io/projected/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-kube-api-access-kmlc2\") pod 
\"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.497610 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.497653 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-kolla-config\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.498251 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-config-data-default\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.498416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-kolla-config\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.500043 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.510666 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.516170 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.529925 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmlc2\" (UniqueName: \"kubernetes.io/projected/4cf2be5d-1c6c-402f-bf93-e9653a6a84cd-kube-api-access-kmlc2\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.602337 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd\") " pod="openstack/openstack-galera-0" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.887052 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"galera-openstack-dockercfg-lfxmx" Nov 25 12:25:00 crc kubenswrapper[4693]: I1125 12:25:00.896260 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.489434 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 25 12:25:01 crc kubenswrapper[4693]: W1125 12:25:01.506340 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cf2be5d_1c6c_402f_bf93_e9653a6a84cd.slice/crio-7ab8bb3889a38d31fb944fd5f92a1694aa87f2ac76b330ab2d4e208045927524 WatchSource:0}: Error finding container 7ab8bb3889a38d31fb944fd5f92a1694aa87f2ac76b330ab2d4e208045927524: Status 404 returned error can't find the container with id 7ab8bb3889a38d31fb944fd5f92a1694aa87f2ac76b330ab2d4e208045927524 Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.647999 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.651629 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.658948 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.659068 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.659438 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.659742 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nl7sc" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.681477 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.838128 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.838211 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.838252 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.838302 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.838335 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.838394 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.840412 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.842335 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.843198 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfww8\" (UniqueName: \"kubernetes.io/projected/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-kube-api-access-hfww8\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.843278 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.846149 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-949sh" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.846428 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.848278 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.883352 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.944793 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/459f5353-15bd-4139-a363-7a1bf6fe94cf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.944861 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " 
pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.944880 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/459f5353-15bd-4139-a363-7a1bf6fe94cf-kolla-config\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.944907 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfww8\" (UniqueName: \"kubernetes.io/projected/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-kube-api-access-hfww8\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.944959 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zkts\" (UniqueName: \"kubernetes.io/projected/459f5353-15bd-4139-a363-7a1bf6fe94cf-kube-api-access-8zkts\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.944984 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.945028 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/459f5353-15bd-4139-a363-7a1bf6fe94cf-config-data\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.945102 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.945149 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459f5353-15bd-4139-a363-7a1bf6fe94cf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.945197 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.945230 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.945306 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.945356 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.947882 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.948782 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.949604 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.949705 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.950312 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.955796 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.956842 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.977160 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfww8\" (UniqueName: 
\"kubernetes.io/projected/9fc3b8be-d4cc-4bb4-86f0-5516294c1221-kube-api-access-hfww8\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.980260 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"9fc3b8be-d4cc-4bb4-86f0-5516294c1221\") " pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:01 crc kubenswrapper[4693]: I1125 12:25:01.995982 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.048265 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/459f5353-15bd-4139-a363-7a1bf6fe94cf-config-data\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.048477 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459f5353-15bd-4139-a363-7a1bf6fe94cf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.049313 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/459f5353-15bd-4139-a363-7a1bf6fe94cf-config-data\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.050070 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/459f5353-15bd-4139-a363-7a1bf6fe94cf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.050676 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/459f5353-15bd-4139-a363-7a1bf6fe94cf-kolla-config\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.052401 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zkts\" (UniqueName: \"kubernetes.io/projected/459f5353-15bd-4139-a363-7a1bf6fe94cf-kube-api-access-8zkts\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.056016 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/459f5353-15bd-4139-a363-7a1bf6fe94cf-kolla-config\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.061669 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/459f5353-15bd-4139-a363-7a1bf6fe94cf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " 
pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.073190 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/459f5353-15bd-4139-a363-7a1bf6fe94cf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.085992 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zkts\" (UniqueName: \"kubernetes.io/projected/459f5353-15bd-4139-a363-7a1bf6fe94cf-kube-api-access-8zkts\") pod \"memcached-0\" (UID: \"459f5353-15bd-4139-a363-7a1bf6fe94cf\") " pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.180971 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 25 12:25:02 crc kubenswrapper[4693]: I1125 12:25:02.258780 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd","Type":"ContainerStarted","Data":"7ab8bb3889a38d31fb944fd5f92a1694aa87f2ac76b330ab2d4e208045927524"} Nov 25 12:25:03 crc kubenswrapper[4693]: I1125 12:25:03.611063 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 12:25:03 crc kubenswrapper[4693]: I1125 12:25:03.612270 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 25 12:25:03 crc kubenswrapper[4693]: I1125 12:25:03.614595 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gk7lv" Nov 25 12:25:03 crc kubenswrapper[4693]: I1125 12:25:03.618870 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 25 12:25:03 crc kubenswrapper[4693]: I1125 12:25:03.785817 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs54q\" (UniqueName: \"kubernetes.io/projected/808a4dcd-a02a-4dd6-a797-5932896d3482-kube-api-access-hs54q\") pod \"kube-state-metrics-0\" (UID: \"808a4dcd-a02a-4dd6-a797-5932896d3482\") " pod="openstack/kube-state-metrics-0" Nov 25 12:25:03 crc kubenswrapper[4693]: I1125 12:25:03.887500 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs54q\" (UniqueName: \"kubernetes.io/projected/808a4dcd-a02a-4dd6-a797-5932896d3482-kube-api-access-hs54q\") pod \"kube-state-metrics-0\" (UID: \"808a4dcd-a02a-4dd6-a797-5932896d3482\") " pod="openstack/kube-state-metrics-0" Nov 25 12:25:03 crc kubenswrapper[4693]: I1125 12:25:03.911443 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs54q\" (UniqueName: \"kubernetes.io/projected/808a4dcd-a02a-4dd6-a797-5932896d3482-kube-api-access-hs54q\") pod \"kube-state-metrics-0\" (UID: \"808a4dcd-a02a-4dd6-a797-5932896d3482\") " pod="openstack/kube-state-metrics-0" Nov 25 12:25:04 crc kubenswrapper[4693]: I1125 12:25:04.016180 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.114185 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.114250 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.114295 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d"
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.114835 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1602027df59cd76a649d636d394ab648e039f6efe47c91bfe119cadecb3b352"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.114904 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://f1602027df59cd76a649d636d394ab648e039f6efe47c91bfe119cadecb3b352" gracePeriod=600
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.284964 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="f1602027df59cd76a649d636d394ab648e039f6efe47c91bfe119cadecb3b352" exitCode=0
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.285022 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"f1602027df59cd76a649d636d394ab648e039f6efe47c91bfe119cadecb3b352"}
Nov 25 12:25:05 crc kubenswrapper[4693]: I1125 12:25:05.285058 4693 scope.go:117] "RemoveContainer" containerID="c48b1bd5f615c180301fa268ce0ea0e2b9ab9ea9e6d73443257071ddeda6d194"
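This is the standard liveness-restart flow: the prober gets connection refused on the daemon's health endpoint, the sync loop marks the container unhealthy, and the runtime kills it with the pod's termination grace period (gracePeriod=600 above) before starting a replacement; the RemoveContainer line is garbage collection of the previous dead instance (c48b1bd5...). A sketch of a probe definition that would produce this failure output, using the k8s.io/api types; only the URL and the grace period come from the log, the timing fields are assumptions:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Endpoint taken from the probe output above; machine-config-daemon runs
	// with host networking, so the kubelet probes 127.0.0.1 directly.
	liveness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		InitialDelaySeconds: 120, // assumption; not visible in the log
		PeriodSeconds:       30,  // assumption
		FailureThreshold:    3,   // assumption
	}
	grace := int64(600) // matches gracePeriod=600 in the kill record
	fmt.Printf("GET http://127.0.0.1:%d%s, terminationGracePeriodSeconds=%d\n",
		liveness.HTTPGet.Port.IntValue(), liveness.HTTPGet.Path, grace)
}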
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.079710 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ndgsx"]
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.081322 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ndgsx"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.084761 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.084811 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.085153 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-n2jf7"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.086349 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8vhnn"]
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.087792 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8vhnn"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.092963 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ndgsx"]
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.106819 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8vhnn"]
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.158863 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-log-ovn\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.158937 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96236f54-53d2-47df-854b-51addeda1dee-combined-ca-bundle\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.158960 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-lib\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.158983 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-etc-ovs\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159008 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prl8g\" (UniqueName: \"kubernetes.io/projected/96236f54-53d2-47df-854b-51addeda1dee-kube-api-access-prl8g\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx"
Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159078 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93d2601b-fc82-478d-8667-dbce77606f4d-scripts\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") "
pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159099 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-run\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159114 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-run-ovn\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159143 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtcsv\" (UniqueName: \"kubernetes.io/projected/93d2601b-fc82-478d-8667-dbce77606f4d-kube-api-access-wtcsv\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159165 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-log\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159183 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96236f54-53d2-47df-854b-51addeda1dee-scripts\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159223 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/96236f54-53d2-47df-854b-51addeda1dee-ovn-controller-tls-certs\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.159241 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-run\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260565 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/96236f54-53d2-47df-854b-51addeda1dee-ovn-controller-tls-certs\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260676 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-run\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc 
kubenswrapper[4693]: I1125 12:25:08.260709 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-log-ovn\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260743 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96236f54-53d2-47df-854b-51addeda1dee-combined-ca-bundle\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260766 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-lib\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260799 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-etc-ovs\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260827 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prl8g\" (UniqueName: \"kubernetes.io/projected/96236f54-53d2-47df-854b-51addeda1dee-kube-api-access-prl8g\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260891 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93d2601b-fc82-478d-8667-dbce77606f4d-scripts\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260920 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-run\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260947 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-run-ovn\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.260987 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtcsv\" (UniqueName: \"kubernetes.io/projected/93d2601b-fc82-478d-8667-dbce77606f4d-kube-api-access-wtcsv\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261016 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-log\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261039 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96236f54-53d2-47df-854b-51addeda1dee-scripts\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261391 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-run\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261473 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-etc-ovs\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261521 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-run-ovn\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261575 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-run\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261618 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-lib\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.261692 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/93d2601b-fc82-478d-8667-dbce77606f4d-var-log\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.262160 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/96236f54-53d2-47df-854b-51addeda1dee-var-log-ovn\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.264265 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/93d2601b-fc82-478d-8667-dbce77606f4d-scripts\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.264891 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96236f54-53d2-47df-854b-51addeda1dee-scripts\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.268120 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96236f54-53d2-47df-854b-51addeda1dee-combined-ca-bundle\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.272143 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/96236f54-53d2-47df-854b-51addeda1dee-ovn-controller-tls-certs\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.290478 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtcsv\" (UniqueName: \"kubernetes.io/projected/93d2601b-fc82-478d-8667-dbce77606f4d-kube-api-access-wtcsv\") pod \"ovn-controller-ovs-8vhnn\" (UID: \"93d2601b-fc82-478d-8667-dbce77606f4d\") " pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.295013 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prl8g\" (UniqueName: \"kubernetes.io/projected/96236f54-53d2-47df-854b-51addeda1dee-kube-api-access-prl8g\") pod \"ovn-controller-ndgsx\" (UID: \"96236f54-53d2-47df-854b-51addeda1dee\") " pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.404720 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.417215 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.980693 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.982222 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.985689 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6lvzb" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.985838 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.986230 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.991810 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.991934 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 25 12:25:08 crc kubenswrapper[4693]: I1125 12:25:08.999024 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.175717 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.177002 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d617274-42b9-4d07-b321-d70a5aeba8ee-config\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.177122 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tvhz\" (UniqueName: \"kubernetes.io/projected/4d617274-42b9-4d07-b321-d70a5aeba8ee-kube-api-access-4tvhz\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.177274 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.177399 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.177508 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.177590 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d617274-42b9-4d07-b321-d70a5aeba8ee-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.177686 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d617274-42b9-4d07-b321-d70a5aeba8ee-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279338 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279478 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279501 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279519 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d617274-42b9-4d07-b321-d70a5aeba8ee-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279560 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d617274-42b9-4d07-b321-d70a5aeba8ee-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279631 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279665 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d617274-42b9-4d07-b321-d70a5aeba8ee-config\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.281099 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tvhz\" (UniqueName: \"kubernetes.io/projected/4d617274-42b9-4d07-b321-d70a5aeba8ee-kube-api-access-4tvhz\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 
12:25:09.280101 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4d617274-42b9-4d07-b321-d70a5aeba8ee-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.281033 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d617274-42b9-4d07-b321-d70a5aeba8ee-config\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.281040 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d617274-42b9-4d07-b321-d70a5aeba8ee-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.279769 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.284889 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.284892 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.284890 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d617274-42b9-4d07-b321-d70a5aeba8ee-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.298497 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tvhz\" (UniqueName: \"kubernetes.io/projected/4d617274-42b9-4d07-b321-d70a5aeba8ee-kube-api-access-4tvhz\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.304776 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4d617274-42b9-4d07-b321-d70a5aeba8ee\") " pod="openstack/ovsdbserver-nb-0"
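For the local-volume plugin, MountVolume.MountDevice resolves the PersistentVolume's configured directory (here /mnt/openstack/pv09) and MountVolume.SetUp then bind-mounts it into the pod. A sketch of the shape of PV consistent with these records; the name and path come from the log, while capacity, access mode, and storage class are assumptions, and the node-affinity term (required for local PVs) simply pins to the node name visible in the log prefix:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	pv := corev1.PersistentVolume{
		ObjectMeta: metav1.ObjectMeta{Name: "local-storage09-crc"},
		Spec: corev1.PersistentVolumeSpec{
			Capacity: corev1.ResourceList{
				corev1.ResourceStorage: resource.MustParse("10Gi"), // assumption
			},
			AccessModes:      []corev1.PersistentVolumeAccessMode{corev1.ReadWriteOnce},
			StorageClassName: "local-storage", // assumption
			PersistentVolumeSource: corev1.PersistentVolumeSource{
				// path from "device mount path /mnt/openstack/pv09" above
				Local: &corev1.LocalVolumeSource{Path: "/mnt/openstack/pv09"},
			},
			NodeAffinity: &corev1.VolumeNodeAffinity{
				Required: &corev1.NodeSelector{
					NodeSelectorTerms: []corev1.NodeSelectorTerm{{
						MatchExpressions: []corev1.NodeSelectorRequirement{{
							Key:      "kubernetes.io/hostname",
							Operator: corev1.NodeSelectorOpIn,
							Values:   []string{"crc"}, // node name from the log prefix
						}},
					}},
				},
			},
		},
	}
	fmt.Println(pv.Name, pv.Spec.Local.Path)
}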
Nov 25 12:25:09 crc kubenswrapper[4693]: I1125 12:25:09.309444 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.593796 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.596487 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.601655 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.601844 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.602154 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.601983 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rq5lp"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.620594 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.702781 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89a481b1-6040-4f15-a63f-d6d2301c3534-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.702837 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.702855 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spbg\" (UniqueName: \"kubernetes.io/projected/89a481b1-6040-4f15-a63f-d6d2301c3534-kube-api-access-5spbg\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.702880 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.702912 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a481b1-6040-4f15-a63f-d6d2301c3534-config\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.703007 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89a481b1-6040-4f15-a63f-d6d2301c3534-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0"
Nov 25 12:25:10 crc kubenswrapper[4693]: I1125
12:25:10.703022 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.703046 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804525 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804648 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89a481b1-6040-4f15-a63f-d6d2301c3534-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804682 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804706 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spbg\" (UniqueName: \"kubernetes.io/projected/89a481b1-6040-4f15-a63f-d6d2301c3534-kube-api-access-5spbg\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804731 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804766 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a481b1-6040-4f15-a63f-d6d2301c3534-config\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804826 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89a481b1-6040-4f15-a63f-d6d2301c3534-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.804845 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-metrics-certs-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.805606 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.805662 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89a481b1-6040-4f15-a63f-d6d2301c3534-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.806457 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89a481b1-6040-4f15-a63f-d6d2301c3534-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.807363 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89a481b1-6040-4f15-a63f-d6d2301c3534-config\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.810285 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.814433 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.815173 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89a481b1-6040-4f15-a63f-d6d2301c3534-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.824248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spbg\" (UniqueName: \"kubernetes.io/projected/89a481b1-6040-4f15-a63f-d6d2301c3534-kube-api-access-5spbg\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.831153 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"89a481b1-6040-4f15-a63f-d6d2301c3534\") " pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:10 crc kubenswrapper[4693]: I1125 12:25:10.923679 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:25 crc kubenswrapper[4693]: E1125 12:25:25.061805 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce" Nov 25 12:25:25 crc kubenswrapper[4693]: E1125 12:25:25.062595 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kmlc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(4cf2be5d-1c6c-402f-bf93-e9653a6a84cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:25:25 crc kubenswrapper[4693]: E1125 12:25:25.063852 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="4cf2be5d-1c6c-402f-bf93-e9653a6a84cd" Nov 25 12:25:25 crc kubenswrapper[4693]: E1125 12:25:25.426497 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb@sha256:10452e2144368e2f128c8fb8ef9e54880b06ef1d71d9f084a0217dcb099c51ce\\\"\"" pod="openstack/openstack-galera-0" 
podUID="4cf2be5d-1c6c-402f-bf93-e9653a6a84cd" Nov 25 12:25:25 crc kubenswrapper[4693]: I1125 12:25:25.474918 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.101310 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.101848 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tfqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bdd77c89-7hxl6_openstack(4c623cf1-c8d2-4e9d-8923-74e3825296f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.103739 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" podUID="4c623cf1-c8d2-4e9d-8923-74e3825296f3" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.189984 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 12:25:26 crc kubenswrapper[4693]: 
E1125 12:25:26.190426 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmpkq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6584b49599-xbtds_openstack(d1a7136b-8bdc-46a5-b26a-9bbe82cafded): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.191934 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6584b49599-xbtds" podUID="d1a7136b-8bdc-46a5-b26a-9bbe82cafded" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.215324 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.215759 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ttr5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c6d9948dc-qg7p6_openstack(281ee04c-45b9-4b52-ace7-e5de8bcb2314): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.216974 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" podUID="281ee04c-45b9-4b52-ace7-e5de8bcb2314" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.254114 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.254463 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kqcjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6486446b9f-4lr9b_openstack(8f86ef62-ce38-42c9-a0bf-667726b714cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.256502 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" podUID="8f86ef62-ce38-42c9-a0bf-667726b714cc" Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.430443 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"459f5353-15bd-4139-a363-7a1bf6fe94cf","Type":"ContainerStarted","Data":"deab75e5594802972da2296816b4f2d7f10c017c1749fa1b1cedced79595714f"} Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.432269 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba\\\"\"" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" podUID="8f86ef62-ce38-42c9-a0bf-667726b714cc" Nov 25 12:25:26 crc kubenswrapper[4693]: E1125 12:25:26.432998 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba\\\"\"" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" podUID="281ee04c-45b9-4b52-ace7-e5de8bcb2314" Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.582758 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.684959 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.695404 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ndgsx"] Nov 25 12:25:26 crc kubenswrapper[4693]: W1125 12:25:26.699269 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96236f54_53d2_47df_854b_51addeda1dee.slice/crio-5487f2b1174a9228edd22e1f34d409ca46e4b11707fa18ca81a4ada850566711 WatchSource:0}: Error finding container 5487f2b1174a9228edd22e1f34d409ca46e4b11707fa18ca81a4ada850566711: Status 404 returned error can't find the container with id 5487f2b1174a9228edd22e1f34d409ca46e4b11707fa18ca81a4ada850566711 Nov 25 12:25:26 crc kubenswrapper[4693]: W1125 12:25:26.704696 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fc3b8be_d4cc_4bb4_86f0_5516294c1221.slice/crio-cb2c679f569ddd7a667ba71030dbf0da1a8c940047391f780edffeeca19fe027 WatchSource:0}: Error finding container cb2c679f569ddd7a667ba71030dbf0da1a8c940047391f780edffeeca19fe027: Status 404 returned error can't find the container with id cb2c679f569ddd7a667ba71030dbf0da1a8c940047391f780edffeeca19fe027 Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.720833 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.857112 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8vhnn"] Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.928727 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.934005 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:25:26 crc kubenswrapper[4693]: W1125 12:25:26.946817 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d617274_42b9_4d07_b321_d70a5aeba8ee.slice/crio-8af7e04582a508666d85739c2dae5852f84dbf57c22a2fa829f4bd373314e970 WatchSource:0}: Error finding container 8af7e04582a508666d85739c2dae5852f84dbf57c22a2fa829f4bd373314e970: Status 404 returned error can't find the container with id 8af7e04582a508666d85739c2dae5852f84dbf57c22a2fa829f4bd373314e970 Nov 25 12:25:26 crc kubenswrapper[4693]: I1125 12:25:26.949236 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.089456 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tfqc\" (UniqueName: \"kubernetes.io/projected/4c623cf1-c8d2-4e9d-8923-74e3825296f3-kube-api-access-9tfqc\") pod \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.089620 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-dns-svc\") pod \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.089688 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmpkq\" (UniqueName: \"kubernetes.io/projected/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-kube-api-access-tmpkq\") pod \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.089716 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c623cf1-c8d2-4e9d-8923-74e3825296f3-config\") pod \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\" (UID: \"4c623cf1-c8d2-4e9d-8923-74e3825296f3\") " Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.089768 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-config\") pod \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\" (UID: \"d1a7136b-8bdc-46a5-b26a-9bbe82cafded\") " Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.090395 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c623cf1-c8d2-4e9d-8923-74e3825296f3-config" (OuterVolumeSpecName: "config") pod "4c623cf1-c8d2-4e9d-8923-74e3825296f3" (UID: "4c623cf1-c8d2-4e9d-8923-74e3825296f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.090560 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-config" (OuterVolumeSpecName: "config") pod "d1a7136b-8bdc-46a5-b26a-9bbe82cafded" (UID: "d1a7136b-8bdc-46a5-b26a-9bbe82cafded"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.090743 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d1a7136b-8bdc-46a5-b26a-9bbe82cafded" (UID: "d1a7136b-8bdc-46a5-b26a-9bbe82cafded"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.118355 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-kube-api-access-tmpkq" (OuterVolumeSpecName: "kube-api-access-tmpkq") pod "d1a7136b-8bdc-46a5-b26a-9bbe82cafded" (UID: "d1a7136b-8bdc-46a5-b26a-9bbe82cafded"). InnerVolumeSpecName "kube-api-access-tmpkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.120341 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c623cf1-c8d2-4e9d-8923-74e3825296f3-kube-api-access-9tfqc" (OuterVolumeSpecName: "kube-api-access-9tfqc") pod "4c623cf1-c8d2-4e9d-8923-74e3825296f3" (UID: "4c623cf1-c8d2-4e9d-8923-74e3825296f3"). InnerVolumeSpecName "kube-api-access-9tfqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.191231 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.191639 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmpkq\" (UniqueName: \"kubernetes.io/projected/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-kube-api-access-tmpkq\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.191653 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c623cf1-c8d2-4e9d-8923-74e3825296f3-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.191670 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1a7136b-8bdc-46a5-b26a-9bbe82cafded-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.191681 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tfqc\" (UniqueName: \"kubernetes.io/projected/4c623cf1-c8d2-4e9d-8923-74e3825296f3-kube-api-access-9tfqc\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.440057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8vhnn" event={"ID":"93d2601b-fc82-478d-8667-dbce77606f4d","Type":"ContainerStarted","Data":"dcf5d69c0e168fb3affb4e2066ca06a35e098c78289b012fa71d96043dbd1d2c"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.444283 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ndgsx" event={"ID":"96236f54-53d2-47df-854b-51addeda1dee","Type":"ContainerStarted","Data":"5487f2b1174a9228edd22e1f34d409ca46e4b11707fa18ca81a4ada850566711"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.446955 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"89a481b1-6040-4f15-a63f-d6d2301c3534","Type":"ContainerStarted","Data":"0ad46b7365f1d4be94b6e87cf5489ce80a0051e70644c0800720903e5844c5c2"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.449072 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" event={"ID":"4c623cf1-c8d2-4e9d-8923-74e3825296f3","Type":"ContainerDied","Data":"1f9249d852e4581cff97fe2ae88cb59ed8973ebf45ffdd4257b83e8ff1ce8ebb"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.449134 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bdd77c89-7hxl6" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.456107 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"245f737f203c8007cd386e41d5f986e5bdb4a5f145f31a6ec9ef66e36fb73a9f"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.459765 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"808a4dcd-a02a-4dd6-a797-5932896d3482","Type":"ContainerStarted","Data":"7057a85b4c453f3eedae73f4af96f256b95190f827144ae668100f2125a3ee87"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.463308 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9fc3b8be-d4cc-4bb4-86f0-5516294c1221","Type":"ContainerStarted","Data":"cb2c679f569ddd7a667ba71030dbf0da1a8c940047391f780edffeeca19fe027"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.464350 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6584b49599-xbtds" event={"ID":"d1a7136b-8bdc-46a5-b26a-9bbe82cafded","Type":"ContainerDied","Data":"96b99dbe417744b2fbd6af2688476f7857bda5d90b6451efb0647c6302c396c8"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.464449 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6584b49599-xbtds" Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.470191 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d617274-42b9-4d07-b321-d70a5aeba8ee","Type":"ContainerStarted","Data":"8af7e04582a508666d85739c2dae5852f84dbf57c22a2fa829f4bd373314e970"} Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.518962 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-7hxl6"] Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.525235 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdd77c89-7hxl6"] Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.549304 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-xbtds"] Nov 25 12:25:27 crc kubenswrapper[4693]: I1125 12:25:27.555689 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6584b49599-xbtds"] Nov 25 12:25:28 crc kubenswrapper[4693]: I1125 12:25:28.478223 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cd38986-be2a-4adf-b594-352740498acd","Type":"ContainerStarted","Data":"dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89"} Nov 25 12:25:28 crc kubenswrapper[4693]: I1125 12:25:28.482455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dcb107e2-5742-4030-a7fc-a8eb016f449b","Type":"ContainerStarted","Data":"20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435"} Nov 25 12:25:28 crc kubenswrapper[4693]: I1125 12:25:28.822977 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c623cf1-c8d2-4e9d-8923-74e3825296f3" path="/var/lib/kubelet/pods/4c623cf1-c8d2-4e9d-8923-74e3825296f3/volumes" Nov 25 12:25:28 crc kubenswrapper[4693]: I1125 12:25:28.823440 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a7136b-8bdc-46a5-b26a-9bbe82cafded" 
path="/var/lib/kubelet/pods/d1a7136b-8bdc-46a5-b26a-9bbe82cafded/volumes" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.395837 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-szpg5"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.401689 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.405075 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.411074 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-szpg5"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.511840 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4lr9b"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.537127 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-lglfk"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.538356 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.549904 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.570629 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-lglfk"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.585573 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266234f1-8683-4a0d-a1ec-42cd82184f11-combined-ca-bundle\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.585654 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/266234f1-8683-4a0d-a1ec-42cd82184f11-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.585702 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzppv\" (UniqueName: \"kubernetes.io/projected/266234f1-8683-4a0d-a1ec-42cd82184f11-kube-api-access-rzppv\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.585744 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/266234f1-8683-4a0d-a1ec-42cd82184f11-ovn-rundir\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.585767 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/266234f1-8683-4a0d-a1ec-42cd82184f11-ovs-rundir\") pod 
\"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.585809 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/266234f1-8683-4a0d-a1ec-42cd82184f11-config\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.662908 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-qg7p6"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689201 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-ovsdbserver-nb\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689307 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/266234f1-8683-4a0d-a1ec-42cd82184f11-ovn-rundir\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689338 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/266234f1-8683-4a0d-a1ec-42cd82184f11-ovs-rundir\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689453 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvms\" (UniqueName: \"kubernetes.io/projected/44bc8d8d-817f-4f28-84ad-89693025013c-kube-api-access-hpvms\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689472 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-dns-svc\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689507 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/266234f1-8683-4a0d-a1ec-42cd82184f11-config\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689535 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266234f1-8683-4a0d-a1ec-42cd82184f11-combined-ca-bundle\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689588 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-config\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689618 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/266234f1-8683-4a0d-a1ec-42cd82184f11-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.689663 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzppv\" (UniqueName: \"kubernetes.io/projected/266234f1-8683-4a0d-a1ec-42cd82184f11-kube-api-access-rzppv\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.690090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/266234f1-8683-4a0d-a1ec-42cd82184f11-ovn-rundir\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.690150 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/266234f1-8683-4a0d-a1ec-42cd82184f11-ovs-rundir\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.691665 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/266234f1-8683-4a0d-a1ec-42cd82184f11-config\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.694989 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-zlwtm"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.697926 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/266234f1-8683-4a0d-a1ec-42cd82184f11-combined-ca-bundle\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.699476 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/266234f1-8683-4a0d-a1ec-42cd82184f11-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.701849 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.709447 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.717865 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-zlwtm"] Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.765165 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzppv\" (UniqueName: \"kubernetes.io/projected/266234f1-8683-4a0d-a1ec-42cd82184f11-kube-api-access-rzppv\") pod \"ovn-controller-metrics-szpg5\" (UID: \"266234f1-8683-4a0d-a1ec-42cd82184f11\") " pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792141 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvms\" (UniqueName: \"kubernetes.io/projected/44bc8d8d-817f-4f28-84ad-89693025013c-kube-api-access-hpvms\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792184 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-dns-svc\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792207 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792276 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-config\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792311 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-dns-svc\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792373 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792410 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-ovsdbserver-nb\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 
12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792432 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-config\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.792464 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8wgh\" (UniqueName: \"kubernetes.io/projected/1fc0ef03-f282-4208-a9b0-62d963459644-kube-api-access-k8wgh\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.793611 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-dns-svc\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.794164 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-config\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.794716 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-ovsdbserver-nb\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.819577 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvms\" (UniqueName: \"kubernetes.io/projected/44bc8d8d-817f-4f28-84ad-89693025013c-kube-api-access-hpvms\") pod \"dnsmasq-dns-65c78595c5-lglfk\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.881030 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.895393 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wgh\" (UniqueName: \"kubernetes.io/projected/1fc0ef03-f282-4208-a9b0-62d963459644-kube-api-access-k8wgh\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.895469 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.895539 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-dns-svc\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.895589 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.895605 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-config\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.896631 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-config\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.896942 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-sb\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.897362 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-dns-svc\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 12:25:30.898531 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-nb\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:30 crc kubenswrapper[4693]: I1125 
12:25:30.919481 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8wgh\" (UniqueName: \"kubernetes.io/projected/1fc0ef03-f282-4208-a9b0-62d963459644-kube-api-access-k8wgh\") pod \"dnsmasq-dns-5c7b6b5695-zlwtm\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.027139 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-szpg5" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.161096 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.607445 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.710785 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-config\") pod \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.711187 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttr5x\" (UniqueName: \"kubernetes.io/projected/281ee04c-45b9-4b52-ace7-e5de8bcb2314-kube-api-access-ttr5x\") pod \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.711251 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-dns-svc\") pod \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\" (UID: \"281ee04c-45b9-4b52-ace7-e5de8bcb2314\") " Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.712455 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-config" (OuterVolumeSpecName: "config") pod "281ee04c-45b9-4b52-ace7-e5de8bcb2314" (UID: "281ee04c-45b9-4b52-ace7-e5de8bcb2314"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.712645 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "281ee04c-45b9-4b52-ace7-e5de8bcb2314" (UID: "281ee04c-45b9-4b52-ace7-e5de8bcb2314"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.718082 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281ee04c-45b9-4b52-ace7-e5de8bcb2314-kube-api-access-ttr5x" (OuterVolumeSpecName: "kube-api-access-ttr5x") pod "281ee04c-45b9-4b52-ace7-e5de8bcb2314" (UID: "281ee04c-45b9-4b52-ace7-e5de8bcb2314"). InnerVolumeSpecName "kube-api-access-ttr5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.813717 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.813909 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttr5x\" (UniqueName: \"kubernetes.io/projected/281ee04c-45b9-4b52-ace7-e5de8bcb2314-kube-api-access-ttr5x\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:31 crc kubenswrapper[4693]: I1125 12:25:31.814122 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/281ee04c-45b9-4b52-ace7-e5de8bcb2314-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.517358 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" event={"ID":"8f86ef62-ce38-42c9-a0bf-667726b714cc","Type":"ContainerDied","Data":"5536230188e274d5700963fc534400241e22f91bbf706fe30591055e9637aa28"} Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.517419 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5536230188e274d5700963fc534400241e22f91bbf706fe30591055e9637aa28" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.518870 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" event={"ID":"281ee04c-45b9-4b52-ace7-e5de8bcb2314","Type":"ContainerDied","Data":"ca8c9f2f7edc12aa519cd452fd30ab2ed2c0f4019a48a97bb8283d017f55534a"} Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.518924 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c6d9948dc-qg7p6" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.553766 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.632419 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-qg7p6"] Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.638995 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c6d9948dc-qg7p6"] Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.729237 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-dns-svc\") pod \"8f86ef62-ce38-42c9-a0bf-667726b714cc\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.729365 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqcjt\" (UniqueName: \"kubernetes.io/projected/8f86ef62-ce38-42c9-a0bf-667726b714cc-kube-api-access-kqcjt\") pod \"8f86ef62-ce38-42c9-a0bf-667726b714cc\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.729454 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-config\") pod \"8f86ef62-ce38-42c9-a0bf-667726b714cc\" (UID: \"8f86ef62-ce38-42c9-a0bf-667726b714cc\") " Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.730237 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-config" (OuterVolumeSpecName: "config") pod "8f86ef62-ce38-42c9-a0bf-667726b714cc" (UID: "8f86ef62-ce38-42c9-a0bf-667726b714cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.731856 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f86ef62-ce38-42c9-a0bf-667726b714cc" (UID: "8f86ef62-ce38-42c9-a0bf-667726b714cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.737338 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f86ef62-ce38-42c9-a0bf-667726b714cc-kube-api-access-kqcjt" (OuterVolumeSpecName: "kube-api-access-kqcjt") pod "8f86ef62-ce38-42c9-a0bf-667726b714cc" (UID: "8f86ef62-ce38-42c9-a0bf-667726b714cc"). InnerVolumeSpecName "kube-api-access-kqcjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.821541 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281ee04c-45b9-4b52-ace7-e5de8bcb2314" path="/var/lib/kubelet/pods/281ee04c-45b9-4b52-ace7-e5de8bcb2314/volumes" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.831282 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.831308 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqcjt\" (UniqueName: \"kubernetes.io/projected/8f86ef62-ce38-42c9-a0bf-667726b714cc-kube-api-access-kqcjt\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.831318 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f86ef62-ce38-42c9-a0bf-667726b714cc-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:32 crc kubenswrapper[4693]: I1125 12:25:32.885585 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-szpg5"] Nov 25 12:25:33 crc kubenswrapper[4693]: I1125 12:25:33.525980 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6486446b9f-4lr9b" Nov 25 12:25:33 crc kubenswrapper[4693]: I1125 12:25:33.567371 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4lr9b"] Nov 25 12:25:33 crc kubenswrapper[4693]: I1125 12:25:33.572654 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6486446b9f-4lr9b"] Nov 25 12:25:33 crc kubenswrapper[4693]: I1125 12:25:33.673021 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-zlwtm"] Nov 25 12:25:33 crc kubenswrapper[4693]: W1125 12:25:33.735816 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc0ef03_f282_4208_a9b0_62d963459644.slice/crio-677f713fe54516bda4486e65122dafc16c3796c27a3fef2cfd886f077f133183 WatchSource:0}: Error finding container 677f713fe54516bda4486e65122dafc16c3796c27a3fef2cfd886f077f133183: Status 404 returned error can't find the container with id 677f713fe54516bda4486e65122dafc16c3796c27a3fef2cfd886f077f133183 Nov 25 12:25:34 crc kubenswrapper[4693]: I1125 12:25:34.014433 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-lglfk"] Nov 25 12:25:34 crc kubenswrapper[4693]: W1125 12:25:34.154848 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44bc8d8d_817f_4f28_84ad_89693025013c.slice/crio-c2c0cf2a03d7ace5907a70313298b85d9e2325c29fd7ec9069fc678a7d1b44cc WatchSource:0}: Error finding container c2c0cf2a03d7ace5907a70313298b85d9e2325c29fd7ec9069fc678a7d1b44cc: Status 404 returned error can't find the container with id c2c0cf2a03d7ace5907a70313298b85d9e2325c29fd7ec9069fc678a7d1b44cc Nov 25 12:25:34 crc kubenswrapper[4693]: I1125 12:25:34.535464 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" event={"ID":"1fc0ef03-f282-4208-a9b0-62d963459644","Type":"ContainerStarted","Data":"677f713fe54516bda4486e65122dafc16c3796c27a3fef2cfd886f077f133183"} Nov 25 12:25:34 crc kubenswrapper[4693]: I1125 
12:25:34.536938 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" event={"ID":"44bc8d8d-817f-4f28-84ad-89693025013c","Type":"ContainerStarted","Data":"c2c0cf2a03d7ace5907a70313298b85d9e2325c29fd7ec9069fc678a7d1b44cc"} Nov 25 12:25:34 crc kubenswrapper[4693]: I1125 12:25:34.538653 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-szpg5" event={"ID":"266234f1-8683-4a0d-a1ec-42cd82184f11","Type":"ContainerStarted","Data":"f558a8b977beb4dd0b6784f0a0d47f3ab14063a5f97117bb37ad8c12196b02bc"} Nov 25 12:25:34 crc kubenswrapper[4693]: I1125 12:25:34.825813 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f86ef62-ce38-42c9-a0bf-667726b714cc" path="/var/lib/kubelet/pods/8f86ef62-ce38-42c9-a0bf-667726b714cc/volumes" Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.550666 4693 generic.go:334] "Generic (PLEG): container finished" podID="1fc0ef03-f282-4208-a9b0-62d963459644" containerID="0d2b6a8178a0d5d6397f57cb55e816aa2621139e982363241fb8e21e7dbef3a8" exitCode=0 Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.550761 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" event={"ID":"1fc0ef03-f282-4208-a9b0-62d963459644","Type":"ContainerDied","Data":"0d2b6a8178a0d5d6397f57cb55e816aa2621139e982363241fb8e21e7dbef3a8"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.556483 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d617274-42b9-4d07-b321-d70a5aeba8ee","Type":"ContainerStarted","Data":"ce5580f5480df6f70a080b4bbffc058da4f6f5956c5948838dfb701cadb19b21"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.569915 4693 generic.go:334] "Generic (PLEG): container finished" podID="93d2601b-fc82-478d-8667-dbce77606f4d" containerID="e0dae969e69a5cfca7ce52d9ff830fd98f55fa711f59a5dd83770f5a6c345e8c" exitCode=0 Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.570021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8vhnn" event={"ID":"93d2601b-fc82-478d-8667-dbce77606f4d","Type":"ContainerDied","Data":"e0dae969e69a5cfca7ce52d9ff830fd98f55fa711f59a5dd83770f5a6c345e8c"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.580306 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"89a481b1-6040-4f15-a63f-d6d2301c3534","Type":"ContainerStarted","Data":"4bf7f42a1e3df4ea65a063a22e2d4d3032702eb83b1f26cf57b44b9f04f44a80"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.587113 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"459f5353-15bd-4139-a363-7a1bf6fe94cf","Type":"ContainerStarted","Data":"d6be5879931309b014c2d87fe5f10d2e0131576c0da970a33b497f0c97438b54"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.587296 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.590542 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"808a4dcd-a02a-4dd6-a797-5932896d3482","Type":"ContainerStarted","Data":"157be9351e9d0e6bca815b5ffd868d645f56c922a3229c669e3e9819e526beff"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.590804 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 
12:25:35.610847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9fc3b8be-d4cc-4bb4-86f0-5516294c1221","Type":"ContainerStarted","Data":"f1064c91a9a489703b806c9a071901521c660c16195bd3bc1e2305ff3e7d3f1d"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.627925 4693 generic.go:334] "Generic (PLEG): container finished" podID="44bc8d8d-817f-4f28-84ad-89693025013c" containerID="8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69" exitCode=0 Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.628248 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" event={"ID":"44bc8d8d-817f-4f28-84ad-89693025013c","Type":"ContainerDied","Data":"8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.634635 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ndgsx" event={"ID":"96236f54-53d2-47df-854b-51addeda1dee","Type":"ContainerStarted","Data":"5e0d170c1192ad547cd87d5ac42a10c3a95678c5db4ffd6f2438c48af7489a88"} Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.636200 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ndgsx" Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.643537 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.60976807 podStartE2EDuration="32.643516793s" podCreationTimestamp="2025-11-25 12:25:03 +0000 UTC" firstStartedPulling="2025-11-25 12:25:26.597292691 +0000 UTC m=+1046.515378072" lastFinishedPulling="2025-11-25 12:25:34.631041414 +0000 UTC m=+1054.549126795" observedRunningTime="2025-11-25 12:25:35.623491304 +0000 UTC m=+1055.541576685" watchObservedRunningTime="2025-11-25 12:25:35.643516793 +0000 UTC m=+1055.561602174" Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.655322 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=27.030048564 podStartE2EDuration="34.655305247s" podCreationTimestamp="2025-11-25 12:25:01 +0000 UTC" firstStartedPulling="2025-11-25 12:25:26.100544037 +0000 UTC m=+1046.018629418" lastFinishedPulling="2025-11-25 12:25:33.72580072 +0000 UTC m=+1053.643886101" observedRunningTime="2025-11-25 12:25:35.639740685 +0000 UTC m=+1055.557826086" watchObservedRunningTime="2025-11-25 12:25:35.655305247 +0000 UTC m=+1055.573390628" Nov 25 12:25:35 crc kubenswrapper[4693]: I1125 12:25:35.708628 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ndgsx" podStartSLOduration=20.265386391 podStartE2EDuration="27.708608599s" podCreationTimestamp="2025-11-25 12:25:08 +0000 UTC" firstStartedPulling="2025-11-25 12:25:26.704648147 +0000 UTC m=+1046.622733528" lastFinishedPulling="2025-11-25 12:25:34.147870355 +0000 UTC m=+1054.065955736" observedRunningTime="2025-11-25 12:25:35.703783152 +0000 UTC m=+1055.621868533" watchObservedRunningTime="2025-11-25 12:25:35.708608599 +0000 UTC m=+1055.626693990" Nov 25 12:25:36 crc kubenswrapper[4693]: I1125 12:25:36.642984 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" event={"ID":"1fc0ef03-f282-4208-a9b0-62d963459644","Type":"ContainerStarted","Data":"15e9e019480709ea29b844b86364b88f8ec8cb94915ba7cb7f3bb5a088c44d9c"} Nov 25 12:25:36 crc kubenswrapper[4693]: I1125 12:25:36.643269 4693 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:36 crc kubenswrapper[4693]: I1125 12:25:36.646428 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" event={"ID":"44bc8d8d-817f-4f28-84ad-89693025013c","Type":"ContainerStarted","Data":"55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce"} Nov 25 12:25:36 crc kubenswrapper[4693]: I1125 12:25:36.646556 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:36 crc kubenswrapper[4693]: I1125 12:25:36.648430 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8vhnn" event={"ID":"93d2601b-fc82-478d-8667-dbce77606f4d","Type":"ContainerStarted","Data":"25630765b3007559981992f25db1a36bf51f2b266a65d0ab21b671cef29cac60"} Nov 25 12:25:36 crc kubenswrapper[4693]: I1125 12:25:36.665982 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" podStartSLOduration=5.786705775 podStartE2EDuration="6.665963432s" podCreationTimestamp="2025-11-25 12:25:30 +0000 UTC" firstStartedPulling="2025-11-25 12:25:33.751784627 +0000 UTC m=+1053.669870008" lastFinishedPulling="2025-11-25 12:25:34.631042284 +0000 UTC m=+1054.549127665" observedRunningTime="2025-11-25 12:25:36.660321042 +0000 UTC m=+1056.578406423" watchObservedRunningTime="2025-11-25 12:25:36.665963432 +0000 UTC m=+1056.584048813" Nov 25 12:25:36 crc kubenswrapper[4693]: I1125 12:25:36.681202 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" podStartSLOduration=6.044993194 podStartE2EDuration="6.681181044s" podCreationTimestamp="2025-11-25 12:25:30 +0000 UTC" firstStartedPulling="2025-11-25 12:25:34.156848551 +0000 UTC m=+1054.074933932" lastFinishedPulling="2025-11-25 12:25:34.793036401 +0000 UTC m=+1054.711121782" observedRunningTime="2025-11-25 12:25:36.674182586 +0000 UTC m=+1056.592267977" watchObservedRunningTime="2025-11-25 12:25:36.681181044 +0000 UTC m=+1056.599266425" Nov 25 12:25:37 crc kubenswrapper[4693]: I1125 12:25:37.669219 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8vhnn" event={"ID":"93d2601b-fc82-478d-8667-dbce77606f4d","Type":"ContainerStarted","Data":"18278db5e661061b4cfa44448e2e93f70ee108c262d2acd4d3cbc24c71631e2f"} Nov 25 12:25:37 crc kubenswrapper[4693]: I1125 12:25:37.670348 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:37 crc kubenswrapper[4693]: I1125 12:25:37.670497 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:25:37 crc kubenswrapper[4693]: I1125 12:25:37.699822 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8vhnn" podStartSLOduration=23.024483544 podStartE2EDuration="29.699802935s" podCreationTimestamp="2025-11-25 12:25:08 +0000 UTC" firstStartedPulling="2025-11-25 12:25:26.880559568 +0000 UTC m=+1046.798644949" lastFinishedPulling="2025-11-25 12:25:33.555878959 +0000 UTC m=+1053.473964340" observedRunningTime="2025-11-25 12:25:37.696115681 +0000 UTC m=+1057.614201062" watchObservedRunningTime="2025-11-25 12:25:37.699802935 +0000 UTC m=+1057.617888316" Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.679689 4693 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-controller-metrics-szpg5" event={"ID":"266234f1-8683-4a0d-a1ec-42cd82184f11","Type":"ContainerStarted","Data":"c3626df84884b757379409c8670fef8fdd90b050a5420afe481472a629f9087f"} Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.685392 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"89a481b1-6040-4f15-a63f-d6d2301c3534","Type":"ContainerStarted","Data":"3873119ce83e7e0806fd5b8c25d7ba39ac25bf05695b5a21cb2b65fd59b60c23"} Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.691157 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4d617274-42b9-4d07-b321-d70a5aeba8ee","Type":"ContainerStarted","Data":"e647d5532f5e022052776c6e9b07a1d7c28ca130db4fa48f011e0d0f84182d32"} Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.693716 4693 generic.go:334] "Generic (PLEG): container finished" podID="9fc3b8be-d4cc-4bb4-86f0-5516294c1221" containerID="f1064c91a9a489703b806c9a071901521c660c16195bd3bc1e2305ff3e7d3f1d" exitCode=0 Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.693804 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9fc3b8be-d4cc-4bb4-86f0-5516294c1221","Type":"ContainerDied","Data":"f1064c91a9a489703b806c9a071901521c660c16195bd3bc1e2305ff3e7d3f1d"} Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.718903 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-szpg5" podStartSLOduration=4.808449219 podStartE2EDuration="8.71887998s" podCreationTimestamp="2025-11-25 12:25:30 +0000 UTC" firstStartedPulling="2025-11-25 12:25:33.640519541 +0000 UTC m=+1053.558604922" lastFinishedPulling="2025-11-25 12:25:37.550950302 +0000 UTC m=+1057.469035683" observedRunningTime="2025-11-25 12:25:38.713401945 +0000 UTC m=+1058.631487546" watchObservedRunningTime="2025-11-25 12:25:38.71887998 +0000 UTC m=+1058.636965361" Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.741534 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.015235843 podStartE2EDuration="29.741511182s" podCreationTimestamp="2025-11-25 12:25:09 +0000 UTC" firstStartedPulling="2025-11-25 12:25:26.731284253 +0000 UTC m=+1046.649369634" lastFinishedPulling="2025-11-25 12:25:37.457559572 +0000 UTC m=+1057.375644973" observedRunningTime="2025-11-25 12:25:38.739363792 +0000 UTC m=+1058.657449173" watchObservedRunningTime="2025-11-25 12:25:38.741511182 +0000 UTC m=+1058.659596563" Nov 25 12:25:38 crc kubenswrapper[4693]: I1125 12:25:38.787528 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=21.202645991 podStartE2EDuration="31.787460076s" podCreationTimestamp="2025-11-25 12:25:07 +0000 UTC" firstStartedPulling="2025-11-25 12:25:26.953049325 +0000 UTC m=+1046.871134706" lastFinishedPulling="2025-11-25 12:25:37.5378634 +0000 UTC m=+1057.455948791" observedRunningTime="2025-11-25 12:25:38.784210064 +0000 UTC m=+1058.702295445" watchObservedRunningTime="2025-11-25 12:25:38.787460076 +0000 UTC m=+1058.705545487" Nov 25 12:25:39 crc kubenswrapper[4693]: I1125 12:25:39.310612 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:39 crc kubenswrapper[4693]: I1125 12:25:39.310829 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:39 crc kubenswrapper[4693]: I1125 12:25:39.345122 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:39 crc kubenswrapper[4693]: I1125 12:25:39.714274 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"9fc3b8be-d4cc-4bb4-86f0-5516294c1221","Type":"ContainerStarted","Data":"c462362c58cd235a4e7057994ba27a9d706b22b2d337002186f85c20041af4d8"} Nov 25 12:25:39 crc kubenswrapper[4693]: I1125 12:25:39.753350 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=32.476897046 podStartE2EDuration="39.753323581s" podCreationTimestamp="2025-11-25 12:25:00 +0000 UTC" firstStartedPulling="2025-11-25 12:25:26.709277019 +0000 UTC m=+1046.627362400" lastFinishedPulling="2025-11-25 12:25:33.985703544 +0000 UTC m=+1053.903788935" observedRunningTime="2025-11-25 12:25:39.751911801 +0000 UTC m=+1059.669997282" watchObservedRunningTime="2025-11-25 12:25:39.753323581 +0000 UTC m=+1059.671409002" Nov 25 12:25:39 crc kubenswrapper[4693]: I1125 12:25:39.791671 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 25 12:25:40 crc kubenswrapper[4693]: I1125 12:25:40.884006 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:40 crc kubenswrapper[4693]: I1125 12:25:40.925121 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:40 crc kubenswrapper[4693]: I1125 12:25:40.925178 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:40 crc kubenswrapper[4693]: I1125 12:25:40.983659 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.163269 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.224233 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-lglfk"] Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.732198 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd","Type":"ContainerStarted","Data":"aaea4b603a4bfe2f3d721364f224b43ef2203ecbff282e601ec66184fcc4609c"} Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.733535 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" podUID="44bc8d8d-817f-4f28-84ad-89693025013c" containerName="dnsmasq-dns" containerID="cri-o://55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce" gracePeriod=10 Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.781095 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.946319 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.947633 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.958927 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.958997 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.958997 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qcnb8" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.959173 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.975708 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.997228 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:41 crc kubenswrapper[4693]: I1125 12:25:41.997751 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.093870 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-scripts\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.094220 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.094252 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.094289 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-config\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.094313 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.094342 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.094408 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wffl8\" (UniqueName: \"kubernetes.io/projected/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-kube-api-access-wffl8\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.182290 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.195932 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-scripts\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.195994 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.196027 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.196051 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-config\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.196068 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.196088 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.196107 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wffl8\" (UniqueName: \"kubernetes.io/projected/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-kube-api-access-wffl8\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.197505 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-scripts\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.197615 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-config\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.198102 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.203711 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.206275 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.208165 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.222211 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wffl8\" (UniqueName: \"kubernetes.io/projected/5de2ed94-055d-4e4b-b069-3bcafd88cc3f-kube-api-access-wffl8\") pod \"ovn-northd-0\" (UID: \"5de2ed94-055d-4e4b-b069-3bcafd88cc3f\") " pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.264002 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.293240 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.399986 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-dns-svc\") pod \"44bc8d8d-817f-4f28-84ad-89693025013c\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.400483 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-config\") pod \"44bc8d8d-817f-4f28-84ad-89693025013c\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.400522 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvms\" (UniqueName: \"kubernetes.io/projected/44bc8d8d-817f-4f28-84ad-89693025013c-kube-api-access-hpvms\") pod \"44bc8d8d-817f-4f28-84ad-89693025013c\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.400541 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-ovsdbserver-nb\") pod \"44bc8d8d-817f-4f28-84ad-89693025013c\" (UID: \"44bc8d8d-817f-4f28-84ad-89693025013c\") " Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.405295 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bc8d8d-817f-4f28-84ad-89693025013c-kube-api-access-hpvms" (OuterVolumeSpecName: "kube-api-access-hpvms") pod "44bc8d8d-817f-4f28-84ad-89693025013c" (UID: "44bc8d8d-817f-4f28-84ad-89693025013c"). InnerVolumeSpecName "kube-api-access-hpvms". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.453625 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-config" (OuterVolumeSpecName: "config") pod "44bc8d8d-817f-4f28-84ad-89693025013c" (UID: "44bc8d8d-817f-4f28-84ad-89693025013c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.455886 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44bc8d8d-817f-4f28-84ad-89693025013c" (UID: "44bc8d8d-817f-4f28-84ad-89693025013c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.457798 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "44bc8d8d-817f-4f28-84ad-89693025013c" (UID: "44bc8d8d-817f-4f28-84ad-89693025013c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.502389 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.502422 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvms\" (UniqueName: \"kubernetes.io/projected/44bc8d8d-817f-4f28-84ad-89693025013c-kube-api-access-hpvms\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.502437 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.502450 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44bc8d8d-817f-4f28-84ad-89693025013c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.703746 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.739335 4693 generic.go:334] "Generic (PLEG): container finished" podID="44bc8d8d-817f-4f28-84ad-89693025013c" containerID="55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce" exitCode=0 Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.739399 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" event={"ID":"44bc8d8d-817f-4f28-84ad-89693025013c","Type":"ContainerDied","Data":"55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce"} Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.739420 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.739441 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c78595c5-lglfk" event={"ID":"44bc8d8d-817f-4f28-84ad-89693025013c","Type":"ContainerDied","Data":"c2c0cf2a03d7ace5907a70313298b85d9e2325c29fd7ec9069fc678a7d1b44cc"} Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.739457 4693 scope.go:117] "RemoveContainer" containerID="55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.741437 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5de2ed94-055d-4e4b-b069-3bcafd88cc3f","Type":"ContainerStarted","Data":"2027c40474d95dbe58a39197fcc46dcbfbd959024d3c97c7bbe6d1884f33d527"} Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.770244 4693 scope.go:117] "RemoveContainer" containerID="8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.776474 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-lglfk"] Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.783751 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c78595c5-lglfk"] Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.810195 4693 scope.go:117] "RemoveContainer" containerID="55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce" Nov 25 12:25:42 crc kubenswrapper[4693]: E1125 12:25:42.810735 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce\": container with ID starting with 55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce not found: ID does not exist" containerID="55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.810816 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce"} err="failed to get container status \"55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce\": rpc error: code = NotFound desc = could not find container \"55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce\": container with ID starting with 55a30afca2b6ce2e1d816e93e0149373774fbfa537f544c8565edc81429ba0ce not found: ID does not exist" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.810869 4693 scope.go:117] "RemoveContainer" containerID="8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69" Nov 25 12:25:42 crc kubenswrapper[4693]: E1125 12:25:42.811338 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69\": container with ID starting with 8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69 not found: ID does not exist" containerID="8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.811423 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69"} err="failed to get container status 
\"8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69\": rpc error: code = NotFound desc = could not find container \"8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69\": container with ID starting with 8cd2fd28edc00deb2dea7b2600543a6f40da8debfffcfcf14096b98228d20d69 not found: ID does not exist" Nov 25 12:25:42 crc kubenswrapper[4693]: I1125 12:25:42.826263 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bc8d8d-817f-4f28-84ad-89693025013c" path="/var/lib/kubelet/pods/44bc8d8d-817f-4f28-84ad-89693025013c/volumes" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.015095 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf8bcbfcf-pwdqc"] Nov 25 12:25:44 crc kubenswrapper[4693]: E1125 12:25:44.029407 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bc8d8d-817f-4f28-84ad-89693025013c" containerName="init" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.029446 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bc8d8d-817f-4f28-84ad-89693025013c" containerName="init" Nov 25 12:25:44 crc kubenswrapper[4693]: E1125 12:25:44.029486 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bc8d8d-817f-4f28-84ad-89693025013c" containerName="dnsmasq-dns" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.029499 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bc8d8d-817f-4f28-84ad-89693025013c" containerName="dnsmasq-dns" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.041117 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bc8d8d-817f-4f28-84ad-89693025013c" containerName="dnsmasq-dns" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.043184 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.043523 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.061054 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf8bcbfcf-pwdqc"] Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.129773 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-nb\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.129880 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-dns-svc\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.129926 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pnhk\" (UniqueName: \"kubernetes.io/projected/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-kube-api-access-5pnhk\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.130007 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-config\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.130031 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-sb\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.232771 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pnhk\" (UniqueName: \"kubernetes.io/projected/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-kube-api-access-5pnhk\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.232834 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-config\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.232863 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-sb\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.232967 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-nb\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.232996 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-dns-svc\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.234008 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-dns-svc\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.234525 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-config\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.235046 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-nb\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.235541 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-sb\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.256729 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pnhk\" (UniqueName: \"kubernetes.io/projected/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-kube-api-access-5pnhk\") pod \"dnsmasq-dns-cf8bcbfcf-pwdqc\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.385500 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.861689 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf8bcbfcf-pwdqc"] Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.876258 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:44 crc kubenswrapper[4693]: I1125 12:25:44.950812 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.162516 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.167902 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.169663 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lbns7" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.169892 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.170010 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.171226 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.184953 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.248332 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr76n\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-kube-api-access-cr76n\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.248404 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.248434 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.248454 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8b28a97-55d7-41b0-aa09-55e4e132bd64-cache\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.248554 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8b28a97-55d7-41b0-aa09-55e4e132bd64-lock\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.350343 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8b28a97-55d7-41b0-aa09-55e4e132bd64-lock\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.350715 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr76n\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-kube-api-access-cr76n\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.350745 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.350769 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.350789 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8b28a97-55d7-41b0-aa09-55e4e132bd64-cache\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.350916 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c8b28a97-55d7-41b0-aa09-55e4e132bd64-lock\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: E1125 12:25:45.351059 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 12:25:45 crc kubenswrapper[4693]: E1125 12:25:45.351088 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.351129 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: E1125 12:25:45.351147 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift podName:c8b28a97-55d7-41b0-aa09-55e4e132bd64 nodeName:}" failed. No retries permitted until 2025-11-25 12:25:45.851126289 +0000 UTC m=+1065.769211670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift") pod "swift-storage-0" (UID: "c8b28a97-55d7-41b0-aa09-55e4e132bd64") : configmap "swift-ring-files" not found Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.351213 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c8b28a97-55d7-41b0-aa09-55e4e132bd64-cache\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.369023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr76n\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-kube-api-access-cr76n\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.376196 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.694711 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5hdw5"] Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.695904 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.697912 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.698188 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.704692 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.721758 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5hdw5"] Nov 25 12:25:45 crc kubenswrapper[4693]: E1125 12:25:45.722435 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-vrwcf ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-5hdw5" podUID="acc12489-0819-40eb-9c94-cd16af99ccbd" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.732347 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2kzct"] Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.733511 4693 util.go:30] "No sandbox for pod can be found. 
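
The MountVolume.SetUp failures above all reduce to one missing object: the projected volume "etc-swift" for swift-storage-0 is built from the ConfigMap openstack/swift-ring-files, which does not exist yet and is presumably published later by the swift-ring-rebalance job being scheduled in parallel. A minimal client-go sketch (not part of the log; the out-of-cluster setup and kubeconfig path are assumptions) performing the same lookup the kubelet's projected-volume plugin keeps failing on:

```go
package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: running outside the cluster with a kubeconfig in the
	// conventional location; in-cluster code would use rest.InClusterConfig().
	home, err := os.UserHomeDir()
	if err != nil {
		panic(err)
	}
	cfg, err := clientcmd.BuildConfigFromFlags("", filepath.Join(home, ".kube", "config"))
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// The same lookup the projected-volume plugin fails on in the log:
	// namespace "openstack", ConfigMap "swift-ring-files".
	_, err = client.CoreV1().ConfigMaps("openstack").Get(
		context.TODO(), "swift-ring-files", metav1.GetOptions{})
	if err != nil {
		fmt.Println("etc-swift cannot be built yet:", err)
		return
	}
	fmt.Println("swift-ring-files exists; the etc-swift mount can proceed")
}
```

Until that Get succeeds, the kubelet simply re-queues the mount, which is the retry cadence visible in the entries that follow.
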
Need to start a new one" pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.737661 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5hdw5"] Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.746801 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2kzct"] Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.756896 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-combined-ca-bundle\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.760250 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-scripts\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.760289 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-ring-data-devices\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.760329 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrwcf\" (UniqueName: \"kubernetes.io/projected/acc12489-0819-40eb-9c94-cd16af99ccbd-kube-api-access-vrwcf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.760398 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-dispersionconf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.760519 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acc12489-0819-40eb-9c94-cd16af99ccbd-etc-swift\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.760568 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-swiftconf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.765674 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" event={"ID":"4decc5db-72bf-4967-b2dd-6f843f9fb3ce","Type":"ContainerStarted","Data":"a9973023fc1425d422e338854564bb3bb495e093c37e20d9d5b36e71f6924fed"} Nov 25 12:25:45 crc 
kubenswrapper[4693]: I1125 12:25:45.765706 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" event={"ID":"4decc5db-72bf-4967-b2dd-6f843f9fb3ce","Type":"ContainerStarted","Data":"ef065f49dcc4dc2f027c68898e9f67e61c584283253a0c0201f9fda3060f8708"} Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.766985 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5de2ed94-055d-4e4b-b069-3bcafd88cc3f","Type":"ContainerStarted","Data":"aa043a18eb351b06d2bc5f91bd5d61c729118fc6be5f1513c68add755af5b4f0"} Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.768135 4693 generic.go:334] "Generic (PLEG): container finished" podID="4cf2be5d-1c6c-402f-bf93-e9653a6a84cd" containerID="aaea4b603a4bfe2f3d721364f224b43ef2203ecbff282e601ec66184fcc4609c" exitCode=0 Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.768218 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.768604 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd","Type":"ContainerDied","Data":"aaea4b603a4bfe2f3d721364f224b43ef2203ecbff282e601ec66184fcc4609c"} Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862181 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-dispersionconf\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862227 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acc12489-0819-40eb-9c94-cd16af99ccbd-etc-swift\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862273 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-swiftconf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862293 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-combined-ca-bundle\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862346 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbcml\" (UniqueName: \"kubernetes.io/projected/88ff5ba0-ea04-4e77-9f16-05711082df93-kube-api-access-fbcml\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862385 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-swiftconf\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862442 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-combined-ca-bundle\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862477 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-scripts\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862603 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862630 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-scripts\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862658 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-ring-data-devices\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862687 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-ring-data-devices\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862731 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrwcf\" (UniqueName: \"kubernetes.io/projected/acc12489-0819-40eb-9c94-cd16af99ccbd-kube-api-access-vrwcf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862780 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-dispersionconf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.862856 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88ff5ba0-ea04-4e77-9f16-05711082df93-etc-swift\") 
pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: E1125 12:25:45.864428 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 12:25:45 crc kubenswrapper[4693]: E1125 12:25:45.864463 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 12:25:45 crc kubenswrapper[4693]: E1125 12:25:45.864513 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift podName:c8b28a97-55d7-41b0-aa09-55e4e132bd64 nodeName:}" failed. No retries permitted until 2025-11-25 12:25:46.864494214 +0000 UTC m=+1066.782579595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift") pod "swift-storage-0" (UID: "c8b28a97-55d7-41b0-aa09-55e4e132bd64") : configmap "swift-ring-files" not found Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.864521 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acc12489-0819-40eb-9c94-cd16af99ccbd-etc-swift\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.865785 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-scripts\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.865802 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-ring-data-devices\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.868354 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-dispersionconf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.870075 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-swiftconf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.871416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-combined-ca-bundle\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.882801 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrwcf\" (UniqueName: 
\"kubernetes.io/projected/acc12489-0819-40eb-9c94-cd16af99ccbd-kube-api-access-vrwcf\") pod \"swift-ring-rebalance-5hdw5\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.883546 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.963637 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-dispersionconf\") pod \"acc12489-0819-40eb-9c94-cd16af99ccbd\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.963814 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrwcf\" (UniqueName: \"kubernetes.io/projected/acc12489-0819-40eb-9c94-cd16af99ccbd-kube-api-access-vrwcf\") pod \"acc12489-0819-40eb-9c94-cd16af99ccbd\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.963853 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-combined-ca-bundle\") pod \"acc12489-0819-40eb-9c94-cd16af99ccbd\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.964633 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-scripts\") pod \"acc12489-0819-40eb-9c94-cd16af99ccbd\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.964697 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-ring-data-devices\") pod \"acc12489-0819-40eb-9c94-cd16af99ccbd\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.964840 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acc12489-0819-40eb-9c94-cd16af99ccbd-etc-swift\") pod \"acc12489-0819-40eb-9c94-cd16af99ccbd\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.964891 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-swiftconf\") pod \"acc12489-0819-40eb-9c94-cd16af99ccbd\" (UID: \"acc12489-0819-40eb-9c94-cd16af99ccbd\") " Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965015 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-scripts" (OuterVolumeSpecName: "scripts") pod "acc12489-0819-40eb-9c94-cd16af99ccbd" (UID: "acc12489-0819-40eb-9c94-cd16af99ccbd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965048 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbcml\" (UniqueName: \"kubernetes.io/projected/88ff5ba0-ea04-4e77-9f16-05711082df93-kube-api-access-fbcml\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965071 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-swiftconf\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965110 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-scripts\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965144 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "acc12489-0819-40eb-9c94-cd16af99ccbd" (UID: "acc12489-0819-40eb-9c94-cd16af99ccbd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965198 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-ring-data-devices\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965252 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88ff5ba0-ea04-4e77-9f16-05711082df93-etc-swift\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965277 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-dispersionconf\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965308 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-combined-ca-bundle\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965357 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965379 4693 reconciler_common.go:293] "Volume detached 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acc12489-0819-40eb-9c94-cd16af99ccbd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965621 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc12489-0819-40eb-9c94-cd16af99ccbd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "acc12489-0819-40eb-9c94-cd16af99ccbd" (UID: "acc12489-0819-40eb-9c94-cd16af99ccbd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.965946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88ff5ba0-ea04-4e77-9f16-05711082df93-etc-swift\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.966333 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-scripts\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.966444 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-ring-data-devices\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.967993 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "acc12489-0819-40eb-9c94-cd16af99ccbd" (UID: "acc12489-0819-40eb-9c94-cd16af99ccbd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.969390 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "acc12489-0819-40eb-9c94-cd16af99ccbd" (UID: "acc12489-0819-40eb-9c94-cd16af99ccbd"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.971310 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acc12489-0819-40eb-9c94-cd16af99ccbd" (UID: "acc12489-0819-40eb-9c94-cd16af99ccbd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.971682 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-dispersionconf\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.972215 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-swiftconf\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.972488 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-combined-ca-bundle\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.972758 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc12489-0819-40eb-9c94-cd16af99ccbd-kube-api-access-vrwcf" (OuterVolumeSpecName: "kube-api-access-vrwcf") pod "acc12489-0819-40eb-9c94-cd16af99ccbd" (UID: "acc12489-0819-40eb-9c94-cd16af99ccbd"). InnerVolumeSpecName "kube-api-access-vrwcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:25:45 crc kubenswrapper[4693]: I1125 12:25:45.991891 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbcml\" (UniqueName: \"kubernetes.io/projected/88ff5ba0-ea04-4e77-9f16-05711082df93-kube-api-access-fbcml\") pod \"swift-ring-rebalance-2kzct\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.066874 4693 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acc12489-0819-40eb-9c94-cd16af99ccbd-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.066902 4693 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.066912 4693 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.066921 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrwcf\" (UniqueName: \"kubernetes.io/projected/acc12489-0819-40eb-9c94-cd16af99ccbd-kube-api-access-vrwcf\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.066931 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc12489-0819-40eb-9c94-cd16af99ccbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.182868 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.633187 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2kzct"] Nov 25 12:25:46 crc kubenswrapper[4693]: W1125 12:25:46.639573 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ff5ba0_ea04_4e77_9f16_05711082df93.slice/crio-49660b86c19a4db186f0439fde95ea90ddc8c195a1ec834b6d6ad4052b7c9a97 WatchSource:0}: Error finding container 49660b86c19a4db186f0439fde95ea90ddc8c195a1ec834b6d6ad4052b7c9a97: Status 404 returned error can't find the container with id 49660b86c19a4db186f0439fde95ea90ddc8c195a1ec834b6d6ad4052b7c9a97 Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.780823 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5de2ed94-055d-4e4b-b069-3bcafd88cc3f","Type":"ContainerStarted","Data":"be7238e9b0acc3914d720c035697eca86cbaa2a75725eb10920a785d0384d70e"} Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.781014 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.786159 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4cf2be5d-1c6c-402f-bf93-e9653a6a84cd","Type":"ContainerStarted","Data":"c85bb95ebe043dcf59557dcc87f0cbde5280444656583b924d1cfa7bd80aca2e"} Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.788242 4693 generic.go:334] "Generic (PLEG): container finished" podID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerID="a9973023fc1425d422e338854564bb3bb495e093c37e20d9d5b36e71f6924fed" exitCode=0 Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.788302 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" event={"ID":"4decc5db-72bf-4967-b2dd-6f843f9fb3ce","Type":"ContainerDied","Data":"a9973023fc1425d422e338854564bb3bb495e093c37e20d9d5b36e71f6924fed"} Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.791468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2kzct" event={"ID":"88ff5ba0-ea04-4e77-9f16-05711082df93","Type":"ContainerStarted","Data":"49660b86c19a4db186f0439fde95ea90ddc8c195a1ec834b6d6ad4052b7c9a97"} Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.791522 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5hdw5" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.802189 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.309494774 podStartE2EDuration="5.802171549s" podCreationTimestamp="2025-11-25 12:25:41 +0000 UTC" firstStartedPulling="2025-11-25 12:25:42.716390032 +0000 UTC m=+1062.634475413" lastFinishedPulling="2025-11-25 12:25:45.209066807 +0000 UTC m=+1065.127152188" observedRunningTime="2025-11-25 12:25:46.799860183 +0000 UTC m=+1066.717945564" watchObservedRunningTime="2025-11-25 12:25:46.802171549 +0000 UTC m=+1066.720256930" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.846748 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371989.008045 podStartE2EDuration="47.846730334s" podCreationTimestamp="2025-11-25 12:24:59 +0000 UTC" firstStartedPulling="2025-11-25 12:25:01.526244727 +0000 UTC m=+1021.444330098" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:25:46.837053759 +0000 UTC m=+1066.755139140" watchObservedRunningTime="2025-11-25 12:25:46.846730334 +0000 UTC m=+1066.764815715" Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.879425 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:46 crc kubenswrapper[4693]: E1125 12:25:46.881130 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 12:25:46 crc kubenswrapper[4693]: E1125 12:25:46.881157 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 12:25:46 crc kubenswrapper[4693]: E1125 12:25:46.881202 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift podName:c8b28a97-55d7-41b0-aa09-55e4e132bd64 nodeName:}" failed. No retries permitted until 2025-11-25 12:25:48.881185112 +0000 UTC m=+1068.799270493 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift") pod "swift-storage-0" (UID: "c8b28a97-55d7-41b0-aa09-55e4e132bd64") : configmap "swift-ring-files" not found Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.944618 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-5hdw5"] Nov 25 12:25:46 crc kubenswrapper[4693]: I1125 12:25:46.951602 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-5hdw5"] Nov 25 12:25:48 crc kubenswrapper[4693]: I1125 12:25:48.805595 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" event={"ID":"4decc5db-72bf-4967-b2dd-6f843f9fb3ce","Type":"ContainerStarted","Data":"00a0e5266a15e605189392cd68b00a1bdb5aa566f7412a99d381fb13ef268035"} Nov 25 12:25:48 crc kubenswrapper[4693]: I1125 12:25:48.806125 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:48 crc kubenswrapper[4693]: I1125 12:25:48.823727 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acc12489-0819-40eb-9c94-cd16af99ccbd" path="/var/lib/kubelet/pods/acc12489-0819-40eb-9c94-cd16af99ccbd/volumes" Nov 25 12:25:48 crc kubenswrapper[4693]: I1125 12:25:48.827509 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" podStartSLOduration=5.827498595 podStartE2EDuration="5.827498595s" podCreationTimestamp="2025-11-25 12:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:25:48.820497746 +0000 UTC m=+1068.738583147" watchObservedRunningTime="2025-11-25 12:25:48.827498595 +0000 UTC m=+1068.745583976" Nov 25 12:25:48 crc kubenswrapper[4693]: I1125 12:25:48.914068 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:48 crc kubenswrapper[4693]: E1125 12:25:48.914253 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 12:25:48 crc kubenswrapper[4693]: E1125 12:25:48.914278 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 12:25:48 crc kubenswrapper[4693]: E1125 12:25:48.914336 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift podName:c8b28a97-55d7-41b0-aa09-55e4e132bd64 nodeName:}" failed. No retries permitted until 2025-11-25 12:25:52.914316828 +0000 UTC m=+1072.832402209 (durationBeforeRetry 4s). 
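
The durationBeforeRetry values for the etc-swift mount double on every failure: 500ms, 1s, 2s, 4s so far, with 8s and 16s further down. That is the per-operation exponential backoff tracked by nestedpendingoperations.go, quoted in the errors above. A toy sketch of the doubling (the ceiling here is an assumed value for illustration only; the kubelet defines its own cap in its backoff bookkeeping):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Doubling pattern matching the durationBeforeRetry values in the log.
	backoff := 500 * time.Millisecond
	// Assumed ceiling, for illustration; not the kubelet's actual constant.
	const maxBackoff = 2 * time.Minute
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("attempt %d failed; durationBeforeRetry %v\n", attempt, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}
```

The first six printed delays (500ms through 16s) match the retries recorded in this section; the backoff resets once a mount attempt finally succeeds.
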
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift") pod "swift-storage-0" (UID: "c8b28a97-55d7-41b0-aa09-55e4e132bd64") : configmap "swift-ring-files" not found Nov 25 12:25:50 crc kubenswrapper[4693]: I1125 12:25:50.897630 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 25 12:25:50 crc kubenswrapper[4693]: I1125 12:25:50.897948 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 25 12:25:52 crc kubenswrapper[4693]: I1125 12:25:52.978464 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:25:52 crc kubenswrapper[4693]: E1125 12:25:52.978691 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 12:25:52 crc kubenswrapper[4693]: E1125 12:25:52.978847 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 12:25:52 crc kubenswrapper[4693]: E1125 12:25:52.978908 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift podName:c8b28a97-55d7-41b0-aa09-55e4e132bd64 nodeName:}" failed. No retries permitted until 2025-11-25 12:26:00.978889273 +0000 UTC m=+1080.896974654 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift") pod "swift-storage-0" (UID: "c8b28a97-55d7-41b0-aa09-55e4e132bd64") : configmap "swift-ring-files" not found Nov 25 12:25:54 crc kubenswrapper[4693]: I1125 12:25:54.386763 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:25:54 crc kubenswrapper[4693]: I1125 12:25:54.495294 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-zlwtm"] Nov 25 12:25:54 crc kubenswrapper[4693]: I1125 12:25:54.495957 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" podUID="1fc0ef03-f282-4208-a9b0-62d963459644" containerName="dnsmasq-dns" containerID="cri-o://15e9e019480709ea29b844b86364b88f8ec8cb94915ba7cb7f3bb5a088c44d9c" gracePeriod=10 Nov 25 12:25:54 crc kubenswrapper[4693]: I1125 12:25:54.871006 4693 generic.go:334] "Generic (PLEG): container finished" podID="1fc0ef03-f282-4208-a9b0-62d963459644" containerID="15e9e019480709ea29b844b86364b88f8ec8cb94915ba7cb7f3bb5a088c44d9c" exitCode=0 Nov 25 12:25:54 crc kubenswrapper[4693]: I1125 12:25:54.871092 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" event={"ID":"1fc0ef03-f282-4208-a9b0-62d963459644","Type":"ContainerDied","Data":"15e9e019480709ea29b844b86364b88f8ec8cb94915ba7cb7f3bb5a088c44d9c"} Nov 25 12:25:55 crc kubenswrapper[4693]: E1125 12:25:55.902630 4693 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.136:52572->38.102.83.136:42645: read tcp 38.102.83.136:52572->38.102.83.136:42645: read: connection reset by peer Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 
12:25:57.324668 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.867520 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.899558 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" event={"ID":"1fc0ef03-f282-4208-a9b0-62d963459644","Type":"ContainerDied","Data":"677f713fe54516bda4486e65122dafc16c3796c27a3fef2cfd886f077f133183"} Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.899616 4693 scope.go:117] "RemoveContainer" containerID="15e9e019480709ea29b844b86364b88f8ec8cb94915ba7cb7f3bb5a088c44d9c" Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.899983 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.974826 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-dns-svc\") pod \"1fc0ef03-f282-4208-a9b0-62d963459644\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.974879 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8wgh\" (UniqueName: \"kubernetes.io/projected/1fc0ef03-f282-4208-a9b0-62d963459644-kube-api-access-k8wgh\") pod \"1fc0ef03-f282-4208-a9b0-62d963459644\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.974903 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-config\") pod \"1fc0ef03-f282-4208-a9b0-62d963459644\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.975027 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-sb\") pod \"1fc0ef03-f282-4208-a9b0-62d963459644\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.975073 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-nb\") pod \"1fc0ef03-f282-4208-a9b0-62d963459644\" (UID: \"1fc0ef03-f282-4208-a9b0-62d963459644\") " Nov 25 12:25:57 crc kubenswrapper[4693]: I1125 12:25:57.993607 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc0ef03-f282-4208-a9b0-62d963459644-kube-api-access-k8wgh" (OuterVolumeSpecName: "kube-api-access-k8wgh") pod "1fc0ef03-f282-4208-a9b0-62d963459644" (UID: "1fc0ef03-f282-4208-a9b0-62d963459644"). InnerVolumeSpecName "kube-api-access-k8wgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.062555 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fc0ef03-f282-4208-a9b0-62d963459644" (UID: "1fc0ef03-f282-4208-a9b0-62d963459644"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.062836 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fc0ef03-f282-4208-a9b0-62d963459644" (UID: "1fc0ef03-f282-4208-a9b0-62d963459644"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.065525 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fc0ef03-f282-4208-a9b0-62d963459644" (UID: "1fc0ef03-f282-4208-a9b0-62d963459644"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.077679 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.077717 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.077739 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.077752 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8wgh\" (UniqueName: \"kubernetes.io/projected/1fc0ef03-f282-4208-a9b0-62d963459644-kube-api-access-k8wgh\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.089110 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-config" (OuterVolumeSpecName: "config") pod "1fc0ef03-f282-4208-a9b0-62d963459644" (UID: "1fc0ef03-f282-4208-a9b0-62d963459644"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.179782 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc0ef03-f282-4208-a9b0-62d963459644-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.235433 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-zlwtm"] Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.243450 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c7b6b5695-zlwtm"] Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.247115 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.464980 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 25 12:25:58 crc kubenswrapper[4693]: I1125 12:25:58.831146 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc0ef03-f282-4208-a9b0-62d963459644" path="/var/lib/kubelet/pods/1fc0ef03-f282-4208-a9b0-62d963459644/volumes" Nov 25 12:25:59 crc kubenswrapper[4693]: I1125 12:25:59.821201 4693 scope.go:117] "RemoveContainer" containerID="0d2b6a8178a0d5d6397f57cb55e816aa2621139e982363241fb8e21e7dbef3a8" Nov 25 12:25:59 crc kubenswrapper[4693]: I1125 12:25:59.913520 4693 generic.go:334] "Generic (PLEG): container finished" podID="4cd38986-be2a-4adf-b594-352740498acd" containerID="dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89" exitCode=0 Nov 25 12:25:59 crc kubenswrapper[4693]: I1125 12:25:59.913604 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cd38986-be2a-4adf-b594-352740498acd","Type":"ContainerDied","Data":"dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89"} Nov 25 12:25:59 crc kubenswrapper[4693]: I1125 12:25:59.915915 4693 generic.go:334] "Generic (PLEG): container finished" podID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerID="20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435" exitCode=0 Nov 25 12:25:59 crc kubenswrapper[4693]: I1125 12:25:59.915959 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dcb107e2-5742-4030-a7fc-a8eb016f449b","Type":"ContainerDied","Data":"20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435"} Nov 25 12:26:00 crc kubenswrapper[4693]: I1125 12:26:00.927997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dcb107e2-5742-4030-a7fc-a8eb016f449b","Type":"ContainerStarted","Data":"a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b"} Nov 25 12:26:00 crc kubenswrapper[4693]: I1125 12:26:00.928629 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 25 12:26:00 crc kubenswrapper[4693]: I1125 12:26:00.933008 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cd38986-be2a-4adf-b594-352740498acd","Type":"ContainerStarted","Data":"a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac"} Nov 25 12:26:00 crc kubenswrapper[4693]: I1125 12:26:00.933573 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:26:00 crc kubenswrapper[4693]: I1125 12:26:00.961101 4693 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.240046539 podStartE2EDuration="1m3.961081264s" podCreationTimestamp="2025-11-25 12:24:57 +0000 UTC" firstStartedPulling="2025-11-25 12:24:59.398548662 +0000 UTC m=+1019.316634043" lastFinishedPulling="2025-11-25 12:25:26.119583386 +0000 UTC m=+1046.037668768" observedRunningTime="2025-11-25 12:26:00.950788652 +0000 UTC m=+1080.868874043" watchObservedRunningTime="2025-11-25 12:26:00.961081264 +0000 UTC m=+1080.879166645" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.033330 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:26:01 crc kubenswrapper[4693]: E1125 12:26:01.033712 4693 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 25 12:26:01 crc kubenswrapper[4693]: E1125 12:26:01.033736 4693 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 25 12:26:01 crc kubenswrapper[4693]: E1125 12:26:01.033777 4693 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift podName:c8b28a97-55d7-41b0-aa09-55e4e132bd64 nodeName:}" failed. No retries permitted until 2025-11-25 12:26:17.033761446 +0000 UTC m=+1096.951846827 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift") pod "swift-storage-0" (UID: "c8b28a97-55d7-41b0-aa09-55e4e132bd64") : configmap "swift-ring-files" not found Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.163108 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c7b6b5695-zlwtm" podUID="1fc0ef03-f282-4208-a9b0-62d963459644" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.801127 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.453270953 podStartE2EDuration="1m4.801110679s" podCreationTimestamp="2025-11-25 12:24:57 +0000 UTC" firstStartedPulling="2025-11-25 12:24:59.732006123 +0000 UTC m=+1019.650091504" lastFinishedPulling="2025-11-25 12:25:26.079845849 +0000 UTC m=+1045.997931230" observedRunningTime="2025-11-25 12:26:00.987178924 +0000 UTC m=+1080.905264335" watchObservedRunningTime="2025-11-25 12:26:01.801110679 +0000 UTC m=+1081.719196060" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.801714 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-1059-account-create-r8kpj"] Nov 25 12:26:01 crc kubenswrapper[4693]: E1125 12:26:01.802034 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc0ef03-f282-4208-a9b0-62d963459644" containerName="init" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.802051 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc0ef03-f282-4208-a9b0-62d963459644" containerName="init" Nov 25 12:26:01 crc kubenswrapper[4693]: E1125 12:26:01.802076 4693 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1fc0ef03-f282-4208-a9b0-62d963459644" containerName="dnsmasq-dns" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.802083 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc0ef03-f282-4208-a9b0-62d963459644" containerName="dnsmasq-dns" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.802239 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc0ef03-f282-4208-a9b0-62d963459644" containerName="dnsmasq-dns" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.802730 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.804461 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.825920 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1059-account-create-r8kpj"] Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.866850 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-b8msz"] Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.867755 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.875363 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-b8msz"] Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.941356 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2kzct" event={"ID":"88ff5ba0-ea04-4e77-9f16-05711082df93","Type":"ContainerStarted","Data":"b32638caf1a6ac5083e81d093aad431f1e6c923b7c994f4ceee3d35b45dc6d63"} Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.946169 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7jz5\" (UniqueName: \"kubernetes.io/projected/60249003-1066-44a7-acae-8c8482813b62-kube-api-access-p7jz5\") pod \"keystone-1059-account-create-r8kpj\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.946279 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1353fe13-196b-4d9d-9217-8b3b8000f38d-operator-scripts\") pod \"keystone-db-create-b8msz\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.946330 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60249003-1066-44a7-acae-8c8482813b62-operator-scripts\") pod \"keystone-1059-account-create-r8kpj\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.946349 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvznv\" (UniqueName: \"kubernetes.io/projected/1353fe13-196b-4d9d-9217-8b3b8000f38d-kube-api-access-wvznv\") pod \"keystone-db-create-b8msz\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:01 crc kubenswrapper[4693]: I1125 12:26:01.964669 4693 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2kzct" podStartSLOduration=2.816525229 podStartE2EDuration="16.964651508s" podCreationTimestamp="2025-11-25 12:25:45 +0000 UTC" firstStartedPulling="2025-11-25 12:25:46.641195932 +0000 UTC m=+1066.559281323" lastFinishedPulling="2025-11-25 12:26:00.789322221 +0000 UTC m=+1080.707407602" observedRunningTime="2025-11-25 12:26:01.958596817 +0000 UTC m=+1081.876682198" watchObservedRunningTime="2025-11-25 12:26:01.964651508 +0000 UTC m=+1081.882736889" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.048065 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60249003-1066-44a7-acae-8c8482813b62-operator-scripts\") pod \"keystone-1059-account-create-r8kpj\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.049303 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvznv\" (UniqueName: \"kubernetes.io/projected/1353fe13-196b-4d9d-9217-8b3b8000f38d-kube-api-access-wvznv\") pod \"keystone-db-create-b8msz\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.049807 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7jz5\" (UniqueName: \"kubernetes.io/projected/60249003-1066-44a7-acae-8c8482813b62-kube-api-access-p7jz5\") pod \"keystone-1059-account-create-r8kpj\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.049253 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60249003-1066-44a7-acae-8c8482813b62-operator-scripts\") pod \"keystone-1059-account-create-r8kpj\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.050837 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1353fe13-196b-4d9d-9217-8b3b8000f38d-operator-scripts\") pod \"keystone-db-create-b8msz\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.051616 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1353fe13-196b-4d9d-9217-8b3b8000f38d-operator-scripts\") pod \"keystone-db-create-b8msz\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.060949 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zzzmj"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.061884 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.069054 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvznv\" (UniqueName: \"kubernetes.io/projected/1353fe13-196b-4d9d-9217-8b3b8000f38d-kube-api-access-wvznv\") pod \"keystone-db-create-b8msz\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.070220 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzzmj"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.082300 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7jz5\" (UniqueName: \"kubernetes.io/projected/60249003-1066-44a7-acae-8c8482813b62-kube-api-access-p7jz5\") pod \"keystone-1059-account-create-r8kpj\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.120522 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.152029 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f3509e5-23a3-440b-8160-8409e8127a8e-operator-scripts\") pod \"placement-db-create-zzzmj\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.152148 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzmv2\" (UniqueName: \"kubernetes.io/projected/9f3509e5-23a3-440b-8160-8409e8127a8e-kube-api-access-mzmv2\") pod \"placement-db-create-zzzmj\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.175551 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3c6c-account-create-pd4xx"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.176798 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.178915 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.183911 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.236798 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c6c-account-create-pd4xx"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.254681 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzmv2\" (UniqueName: \"kubernetes.io/projected/9f3509e5-23a3-440b-8160-8409e8127a8e-kube-api-access-mzmv2\") pod \"placement-db-create-zzzmj\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.254831 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9dc298-2fa0-40fe-b98a-44653c91782a-operator-scripts\") pod \"placement-3c6c-account-create-pd4xx\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.254952 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvxs7\" (UniqueName: \"kubernetes.io/projected/8a9dc298-2fa0-40fe-b98a-44653c91782a-kube-api-access-jvxs7\") pod \"placement-3c6c-account-create-pd4xx\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.255033 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f3509e5-23a3-440b-8160-8409e8127a8e-operator-scripts\") pod \"placement-db-create-zzzmj\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.256217 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f3509e5-23a3-440b-8160-8409e8127a8e-operator-scripts\") pod \"placement-db-create-zzzmj\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.278427 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzmv2\" (UniqueName: \"kubernetes.io/projected/9f3509e5-23a3-440b-8160-8409e8127a8e-kube-api-access-mzmv2\") pod \"placement-db-create-zzzmj\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.329134 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9sx2x"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.331312 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.348140 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9sx2x"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.355976 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9dc298-2fa0-40fe-b98a-44653c91782a-operator-scripts\") pod \"placement-3c6c-account-create-pd4xx\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.356062 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvxs7\" (UniqueName: \"kubernetes.io/projected/8a9dc298-2fa0-40fe-b98a-44653c91782a-kube-api-access-jvxs7\") pod \"placement-3c6c-account-create-pd4xx\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.357292 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9dc298-2fa0-40fe-b98a-44653c91782a-operator-scripts\") pod \"placement-3c6c-account-create-pd4xx\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.381240 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvxs7\" (UniqueName: \"kubernetes.io/projected/8a9dc298-2fa0-40fe-b98a-44653c91782a-kube-api-access-jvxs7\") pod \"placement-3c6c-account-create-pd4xx\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.457346 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a119ae70-86da-487a-baf9-d09d2a36d4bb-operator-scripts\") pod \"glance-db-create-9sx2x\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.457519 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qrpw\" (UniqueName: \"kubernetes.io/projected/a119ae70-86da-487a-baf9-d09d2a36d4bb-kube-api-access-2qrpw\") pod \"glance-db-create-9sx2x\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.476336 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-243d-account-create-fgqqs"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.479619 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.481856 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.483242 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-243d-account-create-fgqqs"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.504522 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.560778 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a119ae70-86da-487a-baf9-d09d2a36d4bb-operator-scripts\") pod \"glance-db-create-9sx2x\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.560877 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq64g\" (UniqueName: \"kubernetes.io/projected/09113a70-8e33-4573-978c-6fe86fa93b2f-kube-api-access-qq64g\") pod \"glance-243d-account-create-fgqqs\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.560921 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09113a70-8e33-4573-978c-6fe86fa93b2f-operator-scripts\") pod \"glance-243d-account-create-fgqqs\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.561134 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qrpw\" (UniqueName: \"kubernetes.io/projected/a119ae70-86da-487a-baf9-d09d2a36d4bb-kube-api-access-2qrpw\") pod \"glance-db-create-9sx2x\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.562004 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a119ae70-86da-487a-baf9-d09d2a36d4bb-operator-scripts\") pod \"glance-db-create-9sx2x\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.579976 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qrpw\" (UniqueName: \"kubernetes.io/projected/a119ae70-86da-487a-baf9-d09d2a36d4bb-kube-api-access-2qrpw\") pod \"glance-db-create-9sx2x\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.591990 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.656300 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.662808 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq64g\" (UniqueName: \"kubernetes.io/projected/09113a70-8e33-4573-978c-6fe86fa93b2f-kube-api-access-qq64g\") pod \"glance-243d-account-create-fgqqs\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.662870 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09113a70-8e33-4573-978c-6fe86fa93b2f-operator-scripts\") pod \"glance-243d-account-create-fgqqs\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.663868 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09113a70-8e33-4573-978c-6fe86fa93b2f-operator-scripts\") pod \"glance-243d-account-create-fgqqs\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.667214 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-b8msz"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.681541 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq64g\" (UniqueName: \"kubernetes.io/projected/09113a70-8e33-4573-978c-6fe86fa93b2f-kube-api-access-qq64g\") pod \"glance-243d-account-create-fgqqs\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: W1125 12:26:02.710619 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1353fe13_196b_4d9d_9217_8b3b8000f38d.slice/crio-e2912a4995feffb962425fb93c176f24af33a49317db205f006057a05d11da47 WatchSource:0}: Error finding container e2912a4995feffb962425fb93c176f24af33a49317db205f006057a05d11da47: Status 404 returned error can't find the container with id e2912a4995feffb962425fb93c176f24af33a49317db205f006057a05d11da47 Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.718149 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-1059-account-create-r8kpj"] Nov 25 12:26:02 crc kubenswrapper[4693]: W1125 12:26:02.721112 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60249003_1066_44a7_acae_8c8482813b62.slice/crio-e84cc4a15f881fd30f36dee690a3534cffd433a304c86f420ea5a5c730aa2943 WatchSource:0}: Error finding container e84cc4a15f881fd30f36dee690a3534cffd433a304c86f420ea5a5c730aa2943: Status 404 returned error can't find the container with id e84cc4a15f881fd30f36dee690a3534cffd433a304c86f420ea5a5c730aa2943 Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.800206 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.953814 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzzmj"] Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.958667 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b8msz" event={"ID":"1353fe13-196b-4d9d-9217-8b3b8000f38d","Type":"ContainerStarted","Data":"e2912a4995feffb962425fb93c176f24af33a49317db205f006057a05d11da47"} Nov 25 12:26:02 crc kubenswrapper[4693]: I1125 12:26:02.962065 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1059-account-create-r8kpj" event={"ID":"60249003-1066-44a7-acae-8c8482813b62","Type":"ContainerStarted","Data":"e84cc4a15f881fd30f36dee690a3534cffd433a304c86f420ea5a5c730aa2943"} Nov 25 12:26:03 crc kubenswrapper[4693]: W1125 12:26:03.113269 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a9dc298_2fa0_40fe_b98a_44653c91782a.slice/crio-1125c2d10832032e1f1c73c778c141e5922ae0448a116cd61bbfbe56314098c0 WatchSource:0}: Error finding container 1125c2d10832032e1f1c73c778c141e5922ae0448a116cd61bbfbe56314098c0: Status 404 returned error can't find the container with id 1125c2d10832032e1f1c73c778c141e5922ae0448a116cd61bbfbe56314098c0 Nov 25 12:26:03 crc kubenswrapper[4693]: I1125 12:26:03.113678 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c6c-account-create-pd4xx"] Nov 25 12:26:03 crc kubenswrapper[4693]: I1125 12:26:03.241680 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9sx2x"] Nov 25 12:26:03 crc kubenswrapper[4693]: W1125 12:26:03.245200 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda119ae70_86da_487a_baf9_d09d2a36d4bb.slice/crio-114330380b1c62d6c3e8a8d3faecc8b5dcd9fdf1d7f4c31ce0a402c6b2d3170a WatchSource:0}: Error finding container 114330380b1c62d6c3e8a8d3faecc8b5dcd9fdf1d7f4c31ce0a402c6b2d3170a: Status 404 returned error can't find the container with id 114330380b1c62d6c3e8a8d3faecc8b5dcd9fdf1d7f4c31ce0a402c6b2d3170a Nov 25 12:26:03 crc kubenswrapper[4693]: I1125 12:26:03.304469 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-243d-account-create-fgqqs"] Nov 25 12:26:03 crc kubenswrapper[4693]: W1125 12:26:03.313964 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09113a70_8e33_4573_978c_6fe86fa93b2f.slice/crio-cb6006e5cb824ce3dbbcf4015bbab4fd67086782225d7d59afcd3d06c53c337b WatchSource:0}: Error finding container cb6006e5cb824ce3dbbcf4015bbab4fd67086782225d7d59afcd3d06c53c337b: Status 404 returned error can't find the container with id cb6006e5cb824ce3dbbcf4015bbab4fd67086782225d7d59afcd3d06c53c337b Nov 25 12:26:03 crc kubenswrapper[4693]: I1125 12:26:03.969914 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-243d-account-create-fgqqs" event={"ID":"09113a70-8e33-4573-978c-6fe86fa93b2f","Type":"ContainerStarted","Data":"cb6006e5cb824ce3dbbcf4015bbab4fd67086782225d7d59afcd3d06c53c337b"} Nov 25 12:26:03 crc kubenswrapper[4693]: I1125 12:26:03.971335 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzzmj" 
event={"ID":"9f3509e5-23a3-440b-8160-8409e8127a8e","Type":"ContainerStarted","Data":"79cd2e9b3b2c9b73130f0eb167281d5a3d821cceee89919e5c524c269c3ab5dc"} Nov 25 12:26:03 crc kubenswrapper[4693]: I1125 12:26:03.972341 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c6c-account-create-pd4xx" event={"ID":"8a9dc298-2fa0-40fe-b98a-44653c91782a","Type":"ContainerStarted","Data":"1125c2d10832032e1f1c73c778c141e5922ae0448a116cd61bbfbe56314098c0"} Nov 25 12:26:03 crc kubenswrapper[4693]: I1125 12:26:03.973232 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9sx2x" event={"ID":"a119ae70-86da-487a-baf9-d09d2a36d4bb","Type":"ContainerStarted","Data":"114330380b1c62d6c3e8a8d3faecc8b5dcd9fdf1d7f4c31ce0a402c6b2d3170a"} Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.980079 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzzmj" event={"ID":"9f3509e5-23a3-440b-8160-8409e8127a8e","Type":"ContainerStarted","Data":"9631b19ceec3e8e86179bc5242a8c10046fece462cd98cc3cf1577a51a0d671d"} Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.981691 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1059-account-create-r8kpj" event={"ID":"60249003-1066-44a7-acae-8c8482813b62","Type":"ContainerStarted","Data":"2b22364c913b0fce382f695a146c711935c6249421c293cdd3383d82eff09c73"} Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.984034 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c6c-account-create-pd4xx" event={"ID":"8a9dc298-2fa0-40fe-b98a-44653c91782a","Type":"ContainerStarted","Data":"e623597a9f52924e3ea9438a620c605afefdd96a0bfe0467469078bed0abb383"} Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.986266 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9sx2x" event={"ID":"a119ae70-86da-487a-baf9-d09d2a36d4bb","Type":"ContainerStarted","Data":"e1c00ddd28fc06e8314b731ab4ee7f42f1eae04e4727d3aebdb3dbf34bf8fc2e"} Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.988068 4693 generic.go:334] "Generic (PLEG): container finished" podID="1353fe13-196b-4d9d-9217-8b3b8000f38d" containerID="2ac60ef80e7c440117c6fddd90ff6e40f0560049991e62afd1f5bcc2b45180d5" exitCode=0 Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.988123 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b8msz" event={"ID":"1353fe13-196b-4d9d-9217-8b3b8000f38d","Type":"ContainerDied","Data":"2ac60ef80e7c440117c6fddd90ff6e40f0560049991e62afd1f5bcc2b45180d5"} Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.989453 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-243d-account-create-fgqqs" event={"ID":"09113a70-8e33-4573-978c-6fe86fa93b2f","Type":"ContainerStarted","Data":"9657f26267d390371f31a92e624cb49876ca37a1f193faa8f25187c2cef5ab7b"} Nov 25 12:26:04 crc kubenswrapper[4693]: I1125 12:26:04.996691 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-zzzmj" podStartSLOduration=2.996672757 podStartE2EDuration="2.996672757s" podCreationTimestamp="2025-11-25 12:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:04.994168076 +0000 UTC m=+1084.912253457" watchObservedRunningTime="2025-11-25 12:26:04.996672757 +0000 UTC m=+1084.914758138" Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 
12:26:05.012449 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3c6c-account-create-pd4xx" podStartSLOduration=3.012430904 podStartE2EDuration="3.012430904s" podCreationTimestamp="2025-11-25 12:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:05.011265051 +0000 UTC m=+1084.929350432" watchObservedRunningTime="2025-11-25 12:26:05.012430904 +0000 UTC m=+1084.930516285" Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.025388 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-9sx2x" podStartSLOduration=3.02534951 podStartE2EDuration="3.02534951s" podCreationTimestamp="2025-11-25 12:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:05.023045816 +0000 UTC m=+1084.941131207" watchObservedRunningTime="2025-11-25 12:26:05.02534951 +0000 UTC m=+1084.943434891" Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.044056 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-243d-account-create-fgqqs" podStartSLOduration=3.044036131 podStartE2EDuration="3.044036131s" podCreationTimestamp="2025-11-25 12:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:05.042652682 +0000 UTC m=+1084.960738073" watchObservedRunningTime="2025-11-25 12:26:05.044036131 +0000 UTC m=+1084.962121512" Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.090656 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-1059-account-create-r8kpj" podStartSLOduration=4.090631743 podStartE2EDuration="4.090631743s" podCreationTimestamp="2025-11-25 12:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:05.08347096 +0000 UTC m=+1085.001556351" watchObservedRunningTime="2025-11-25 12:26:05.090631743 +0000 UTC m=+1085.008717124" Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.997287 4693 generic.go:334] "Generic (PLEG): container finished" podID="60249003-1066-44a7-acae-8c8482813b62" containerID="2b22364c913b0fce382f695a146c711935c6249421c293cdd3383d82eff09c73" exitCode=0 Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.997369 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1059-account-create-r8kpj" event={"ID":"60249003-1066-44a7-acae-8c8482813b62","Type":"ContainerDied","Data":"2b22364c913b0fce382f695a146c711935c6249421c293cdd3383d82eff09c73"} Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.998676 4693 generic.go:334] "Generic (PLEG): container finished" podID="8a9dc298-2fa0-40fe-b98a-44653c91782a" containerID="e623597a9f52924e3ea9438a620c605afefdd96a0bfe0467469078bed0abb383" exitCode=0 Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.998723 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c6c-account-create-pd4xx" event={"ID":"8a9dc298-2fa0-40fe-b98a-44653c91782a","Type":"ContainerDied","Data":"e623597a9f52924e3ea9438a620c605afefdd96a0bfe0467469078bed0abb383"} Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.999806 4693 generic.go:334] "Generic (PLEG): container finished" podID="a119ae70-86da-487a-baf9-d09d2a36d4bb" 
containerID="e1c00ddd28fc06e8314b731ab4ee7f42f1eae04e4727d3aebdb3dbf34bf8fc2e" exitCode=0 Nov 25 12:26:05 crc kubenswrapper[4693]: I1125 12:26:05.999828 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9sx2x" event={"ID":"a119ae70-86da-487a-baf9-d09d2a36d4bb","Type":"ContainerDied","Data":"e1c00ddd28fc06e8314b731ab4ee7f42f1eae04e4727d3aebdb3dbf34bf8fc2e"} Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.001107 4693 generic.go:334] "Generic (PLEG): container finished" podID="09113a70-8e33-4573-978c-6fe86fa93b2f" containerID="9657f26267d390371f31a92e624cb49876ca37a1f193faa8f25187c2cef5ab7b" exitCode=0 Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.001134 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-243d-account-create-fgqqs" event={"ID":"09113a70-8e33-4573-978c-6fe86fa93b2f","Type":"ContainerDied","Data":"9657f26267d390371f31a92e624cb49876ca37a1f193faa8f25187c2cef5ab7b"} Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.002338 4693 generic.go:334] "Generic (PLEG): container finished" podID="9f3509e5-23a3-440b-8160-8409e8127a8e" containerID="9631b19ceec3e8e86179bc5242a8c10046fece462cd98cc3cf1577a51a0d671d" exitCode=0 Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.002393 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzzmj" event={"ID":"9f3509e5-23a3-440b-8160-8409e8127a8e","Type":"ContainerDied","Data":"9631b19ceec3e8e86179bc5242a8c10046fece462cd98cc3cf1577a51a0d671d"} Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.370691 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.435407 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1353fe13-196b-4d9d-9217-8b3b8000f38d-operator-scripts\") pod \"1353fe13-196b-4d9d-9217-8b3b8000f38d\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.435883 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvznv\" (UniqueName: \"kubernetes.io/projected/1353fe13-196b-4d9d-9217-8b3b8000f38d-kube-api-access-wvznv\") pod \"1353fe13-196b-4d9d-9217-8b3b8000f38d\" (UID: \"1353fe13-196b-4d9d-9217-8b3b8000f38d\") " Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.436025 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1353fe13-196b-4d9d-9217-8b3b8000f38d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1353fe13-196b-4d9d-9217-8b3b8000f38d" (UID: "1353fe13-196b-4d9d-9217-8b3b8000f38d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.436339 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1353fe13-196b-4d9d-9217-8b3b8000f38d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.442550 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1353fe13-196b-4d9d-9217-8b3b8000f38d-kube-api-access-wvznv" (OuterVolumeSpecName: "kube-api-access-wvznv") pod "1353fe13-196b-4d9d-9217-8b3b8000f38d" (UID: "1353fe13-196b-4d9d-9217-8b3b8000f38d"). 
InnerVolumeSpecName "kube-api-access-wvznv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:06 crc kubenswrapper[4693]: I1125 12:26:06.538006 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvznv\" (UniqueName: \"kubernetes.io/projected/1353fe13-196b-4d9d-9217-8b3b8000f38d-kube-api-access-wvznv\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.010945 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-b8msz" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.011697 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-b8msz" event={"ID":"1353fe13-196b-4d9d-9217-8b3b8000f38d","Type":"ContainerDied","Data":"e2912a4995feffb962425fb93c176f24af33a49317db205f006057a05d11da47"} Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.011726 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2912a4995feffb962425fb93c176f24af33a49317db205f006057a05d11da47" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.312132 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.351530 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f3509e5-23a3-440b-8160-8409e8127a8e-operator-scripts\") pod \"9f3509e5-23a3-440b-8160-8409e8127a8e\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.351700 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzmv2\" (UniqueName: \"kubernetes.io/projected/9f3509e5-23a3-440b-8160-8409e8127a8e-kube-api-access-mzmv2\") pod \"9f3509e5-23a3-440b-8160-8409e8127a8e\" (UID: \"9f3509e5-23a3-440b-8160-8409e8127a8e\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.352813 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3509e5-23a3-440b-8160-8409e8127a8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f3509e5-23a3-440b-8160-8409e8127a8e" (UID: "9f3509e5-23a3-440b-8160-8409e8127a8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.368617 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3509e5-23a3-440b-8160-8409e8127a8e-kube-api-access-mzmv2" (OuterVolumeSpecName: "kube-api-access-mzmv2") pod "9f3509e5-23a3-440b-8160-8409e8127a8e" (UID: "9f3509e5-23a3-440b-8160-8409e8127a8e"). InnerVolumeSpecName "kube-api-access-mzmv2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.453730 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f3509e5-23a3-440b-8160-8409e8127a8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.453790 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzmv2\" (UniqueName: \"kubernetes.io/projected/9f3509e5-23a3-440b-8160-8409e8127a8e-kube-api-access-mzmv2\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.479437 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.555043 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9dc298-2fa0-40fe-b98a-44653c91782a-operator-scripts\") pod \"8a9dc298-2fa0-40fe-b98a-44653c91782a\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.555319 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvxs7\" (UniqueName: \"kubernetes.io/projected/8a9dc298-2fa0-40fe-b98a-44653c91782a-kube-api-access-jvxs7\") pod \"8a9dc298-2fa0-40fe-b98a-44653c91782a\" (UID: \"8a9dc298-2fa0-40fe-b98a-44653c91782a\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.555550 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9dc298-2fa0-40fe-b98a-44653c91782a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8a9dc298-2fa0-40fe-b98a-44653c91782a" (UID: "8a9dc298-2fa0-40fe-b98a-44653c91782a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.556013 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a9dc298-2fa0-40fe-b98a-44653c91782a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.560199 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9dc298-2fa0-40fe-b98a-44653c91782a-kube-api-access-jvxs7" (OuterVolumeSpecName: "kube-api-access-jvxs7") pod "8a9dc298-2fa0-40fe-b98a-44653c91782a" (UID: "8a9dc298-2fa0-40fe-b98a-44653c91782a"). InnerVolumeSpecName "kube-api-access-jvxs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.658342 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvxs7\" (UniqueName: \"kubernetes.io/projected/8a9dc298-2fa0-40fe-b98a-44653c91782a-kube-api-access-jvxs7\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.668365 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.678041 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.682318 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.759227 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a119ae70-86da-487a-baf9-d09d2a36d4bb-operator-scripts\") pod \"a119ae70-86da-487a-baf9-d09d2a36d4bb\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.759309 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qrpw\" (UniqueName: \"kubernetes.io/projected/a119ae70-86da-487a-baf9-d09d2a36d4bb-kube-api-access-2qrpw\") pod \"a119ae70-86da-487a-baf9-d09d2a36d4bb\" (UID: \"a119ae70-86da-487a-baf9-d09d2a36d4bb\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.759356 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq64g\" (UniqueName: \"kubernetes.io/projected/09113a70-8e33-4573-978c-6fe86fa93b2f-kube-api-access-qq64g\") pod \"09113a70-8e33-4573-978c-6fe86fa93b2f\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.759417 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7jz5\" (UniqueName: \"kubernetes.io/projected/60249003-1066-44a7-acae-8c8482813b62-kube-api-access-p7jz5\") pod \"60249003-1066-44a7-acae-8c8482813b62\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.759510 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60249003-1066-44a7-acae-8c8482813b62-operator-scripts\") pod \"60249003-1066-44a7-acae-8c8482813b62\" (UID: \"60249003-1066-44a7-acae-8c8482813b62\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.759551 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09113a70-8e33-4573-978c-6fe86fa93b2f-operator-scripts\") pod \"09113a70-8e33-4573-978c-6fe86fa93b2f\" (UID: \"09113a70-8e33-4573-978c-6fe86fa93b2f\") " Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.760252 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09113a70-8e33-4573-978c-6fe86fa93b2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09113a70-8e33-4573-978c-6fe86fa93b2f" (UID: "09113a70-8e33-4573-978c-6fe86fa93b2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.760728 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a119ae70-86da-487a-baf9-d09d2a36d4bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a119ae70-86da-487a-baf9-d09d2a36d4bb" (UID: "a119ae70-86da-487a-baf9-d09d2a36d4bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.789755 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a119ae70-86da-487a-baf9-d09d2a36d4bb-kube-api-access-2qrpw" (OuterVolumeSpecName: "kube-api-access-2qrpw") pod "a119ae70-86da-487a-baf9-d09d2a36d4bb" (UID: "a119ae70-86da-487a-baf9-d09d2a36d4bb"). InnerVolumeSpecName "kube-api-access-2qrpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.789857 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09113a70-8e33-4573-978c-6fe86fa93b2f-kube-api-access-qq64g" (OuterVolumeSpecName: "kube-api-access-qq64g") pod "09113a70-8e33-4573-978c-6fe86fa93b2f" (UID: "09113a70-8e33-4573-978c-6fe86fa93b2f"). InnerVolumeSpecName "kube-api-access-qq64g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.789964 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60249003-1066-44a7-acae-8c8482813b62-kube-api-access-p7jz5" (OuterVolumeSpecName: "kube-api-access-p7jz5") pod "60249003-1066-44a7-acae-8c8482813b62" (UID: "60249003-1066-44a7-acae-8c8482813b62"). InnerVolumeSpecName "kube-api-access-p7jz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.790113 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60249003-1066-44a7-acae-8c8482813b62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60249003-1066-44a7-acae-8c8482813b62" (UID: "60249003-1066-44a7-acae-8c8482813b62"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.861072 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qrpw\" (UniqueName: \"kubernetes.io/projected/a119ae70-86da-487a-baf9-d09d2a36d4bb-kube-api-access-2qrpw\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.861413 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq64g\" (UniqueName: \"kubernetes.io/projected/09113a70-8e33-4573-978c-6fe86fa93b2f-kube-api-access-qq64g\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.861509 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7jz5\" (UniqueName: \"kubernetes.io/projected/60249003-1066-44a7-acae-8c8482813b62-kube-api-access-p7jz5\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.861569 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60249003-1066-44a7-acae-8c8482813b62-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.861639 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09113a70-8e33-4573-978c-6fe86fa93b2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:07 crc kubenswrapper[4693]: I1125 12:26:07.861720 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a119ae70-86da-487a-baf9-d09d2a36d4bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.019970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-1059-account-create-r8kpj" event={"ID":"60249003-1066-44a7-acae-8c8482813b62","Type":"ContainerDied","Data":"e84cc4a15f881fd30f36dee690a3534cffd433a304c86f420ea5a5c730aa2943"} Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.020030 4693 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e84cc4a15f881fd30f36dee690a3534cffd433a304c86f420ea5a5c730aa2943" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.019994 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-1059-account-create-r8kpj" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.028469 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c6c-account-create-pd4xx" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.028590 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c6c-account-create-pd4xx" event={"ID":"8a9dc298-2fa0-40fe-b98a-44653c91782a","Type":"ContainerDied","Data":"1125c2d10832032e1f1c73c778c141e5922ae0448a116cd61bbfbe56314098c0"} Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.028642 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1125c2d10832032e1f1c73c778c141e5922ae0448a116cd61bbfbe56314098c0" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.030581 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9sx2x" event={"ID":"a119ae70-86da-487a-baf9-d09d2a36d4bb","Type":"ContainerDied","Data":"114330380b1c62d6c3e8a8d3faecc8b5dcd9fdf1d7f4c31ce0a402c6b2d3170a"} Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.030624 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114330380b1c62d6c3e8a8d3faecc8b5dcd9fdf1d7f4c31ce0a402c6b2d3170a" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.030721 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9sx2x" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.032450 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-243d-account-create-fgqqs" event={"ID":"09113a70-8e33-4573-978c-6fe86fa93b2f","Type":"ContainerDied","Data":"cb6006e5cb824ce3dbbcf4015bbab4fd67086782225d7d59afcd3d06c53c337b"} Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.032513 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb6006e5cb824ce3dbbcf4015bbab4fd67086782225d7d59afcd3d06c53c337b" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.032878 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-243d-account-create-fgqqs" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.034037 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzzmj" event={"ID":"9f3509e5-23a3-440b-8160-8409e8127a8e","Type":"ContainerDied","Data":"79cd2e9b3b2c9b73130f0eb167281d5a3d821cceee89919e5c524c269c3ab5dc"} Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.034090 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79cd2e9b3b2c9b73130f0eb167281d5a3d821cceee89919e5c524c269c3ab5dc" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.034193 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzzmj" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.442954 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ndgsx" podUID="96236f54-53d2-47df-854b-51addeda1dee" containerName="ovn-controller" probeResult="failure" output=< Nov 25 12:26:08 crc kubenswrapper[4693]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 25 12:26:08 crc kubenswrapper[4693]: > Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.470068 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.477217 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8vhnn" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698076 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ndgsx-config-f6qp6"] Nov 25 12:26:08 crc kubenswrapper[4693]: E1125 12:26:08.698522 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60249003-1066-44a7-acae-8c8482813b62" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698550 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="60249003-1066-44a7-acae-8c8482813b62" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: E1125 12:26:08.698566 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9dc298-2fa0-40fe-b98a-44653c91782a" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698574 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9dc298-2fa0-40fe-b98a-44653c91782a" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: E1125 12:26:08.698593 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1353fe13-196b-4d9d-9217-8b3b8000f38d" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698600 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1353fe13-196b-4d9d-9217-8b3b8000f38d" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: E1125 12:26:08.698613 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a119ae70-86da-487a-baf9-d09d2a36d4bb" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698620 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a119ae70-86da-487a-baf9-d09d2a36d4bb" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: E1125 12:26:08.698634 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09113a70-8e33-4573-978c-6fe86fa93b2f" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698642 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="09113a70-8e33-4573-978c-6fe86fa93b2f" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: E1125 12:26:08.698659 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3509e5-23a3-440b-8160-8409e8127a8e" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698666 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3509e5-23a3-440b-8160-8409e8127a8e" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 
12:26:08.698863 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a119ae70-86da-487a-baf9-d09d2a36d4bb" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698886 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9dc298-2fa0-40fe-b98a-44653c91782a" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698901 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3509e5-23a3-440b-8160-8409e8127a8e" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698913 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="60249003-1066-44a7-acae-8c8482813b62" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698922 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="09113a70-8e33-4573-978c-6fe86fa93b2f" containerName="mariadb-account-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.698938 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1353fe13-196b-4d9d-9217-8b3b8000f38d" containerName="mariadb-database-create" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.699626 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.702530 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.717836 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ndgsx-config-f6qp6"] Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.783474 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-log-ovn\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.783547 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnc4f\" (UniqueName: \"kubernetes.io/projected/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-kube-api-access-rnc4f\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.783631 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-additional-scripts\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.783668 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.783698 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-scripts\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.783726 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run-ovn\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.885184 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-additional-scripts\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.885259 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.885296 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-scripts\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.885323 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run-ovn\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.885476 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-log-ovn\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.885504 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnc4f\" (UniqueName: \"kubernetes.io/projected/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-kube-api-access-rnc4f\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.886136 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.886147 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run-ovn\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.886274 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-log-ovn\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.886527 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-additional-scripts\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.888572 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-scripts\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:08 crc kubenswrapper[4693]: I1125 12:26:08.908825 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnc4f\" (UniqueName: \"kubernetes.io/projected/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-kube-api-access-rnc4f\") pod \"ovn-controller-ndgsx-config-f6qp6\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:09 crc kubenswrapper[4693]: I1125 12:26:09.019202 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:09 crc kubenswrapper[4693]: I1125 12:26:09.456453 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ndgsx-config-f6qp6"] Nov 25 12:26:09 crc kubenswrapper[4693]: W1125 12:26:09.461429 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ff1a2e3_b0d9_4895_b93b_e71b84ba7961.slice/crio-4a113c5739db996294125fb07335939d64e4a7c73d78c743ddd953725d07a1a9 WatchSource:0}: Error finding container 4a113c5739db996294125fb07335939d64e4a7c73d78c743ddd953725d07a1a9: Status 404 returned error can't find the container with id 4a113c5739db996294125fb07335939d64e4a7c73d78c743ddd953725d07a1a9 Nov 25 12:26:10 crc kubenswrapper[4693]: I1125 12:26:10.049571 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ndgsx-config-f6qp6" event={"ID":"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961","Type":"ContainerStarted","Data":"4a113c5739db996294125fb07335939d64e4a7c73d78c743ddd953725d07a1a9"} Nov 25 12:26:11 crc kubenswrapper[4693]: I1125 12:26:11.057678 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ndgsx-config-f6qp6" event={"ID":"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961","Type":"ContainerStarted","Data":"dbd83a61272aad32773d0082cb1579c348d783dd55b8c14070663bfe58a4a673"} Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.069611 4693 generic.go:334] "Generic (PLEG): container finished" podID="3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" containerID="dbd83a61272aad32773d0082cb1579c348d783dd55b8c14070663bfe58a4a673" exitCode=0 Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.069661 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ndgsx-config-f6qp6" event={"ID":"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961","Type":"ContainerDied","Data":"dbd83a61272aad32773d0082cb1579c348d783dd55b8c14070663bfe58a4a673"} Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.729332 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qdmcl"] Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.730716 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.732575 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.732596 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q2hwj" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.742436 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qdmcl"] Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.756026 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-db-sync-config-data\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.756084 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-combined-ca-bundle\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.756125 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vfxm\" (UniqueName: \"kubernetes.io/projected/e8135cf2-4e92-4e70-9c47-f5fae388c0be-kube-api-access-6vfxm\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.756195 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-config-data\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.857571 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-config-data\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.857705 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-db-sync-config-data\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.857729 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-combined-ca-bundle\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.857749 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vfxm\" (UniqueName: \"kubernetes.io/projected/e8135cf2-4e92-4e70-9c47-f5fae388c0be-kube-api-access-6vfxm\") pod 
\"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.867503 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-db-sync-config-data\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.867688 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-config-data\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.879090 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vfxm\" (UniqueName: \"kubernetes.io/projected/e8135cf2-4e92-4e70-9c47-f5fae388c0be-kube-api-access-6vfxm\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:12 crc kubenswrapper[4693]: I1125 12:26:12.880840 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-combined-ca-bundle\") pod \"glance-db-sync-qdmcl\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.050191 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qdmcl" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.082562 4693 generic.go:334] "Generic (PLEG): container finished" podID="88ff5ba0-ea04-4e77-9f16-05711082df93" containerID="b32638caf1a6ac5083e81d093aad431f1e6c923b7c994f4ceee3d35b45dc6d63" exitCode=0 Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.082630 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2kzct" event={"ID":"88ff5ba0-ea04-4e77-9f16-05711082df93","Type":"ContainerDied","Data":"b32638caf1a6ac5083e81d093aad431f1e6c923b7c994f4ceee3d35b45dc6d63"} Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.388765 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.450850 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ndgsx" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465520 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run-ovn\") pod \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465557 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-scripts\") pod \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465616 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnc4f\" (UniqueName: \"kubernetes.io/projected/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-kube-api-access-rnc4f\") pod \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465668 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run\") pod \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465687 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-additional-scripts\") pod \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465725 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-log-ovn\") pod \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\" (UID: \"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961\") " Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465829 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run" (OuterVolumeSpecName: "var-run") pod "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" (UID: "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.465878 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" (UID: "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.466400 4693 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.466419 4693 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.466483 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" (UID: "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.467024 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" (UID: "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.467807 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-scripts" (OuterVolumeSpecName: "scripts") pod "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" (UID: "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.478151 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-kube-api-access-rnc4f" (OuterVolumeSpecName: "kube-api-access-rnc4f") pod "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" (UID: "3ff1a2e3-b0d9-4895-b93b-e71b84ba7961"). InnerVolumeSpecName "kube-api-access-rnc4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.567759 4693 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.567799 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.567813 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnc4f\" (UniqueName: \"kubernetes.io/projected/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-kube-api-access-rnc4f\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.567824 4693 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:13 crc kubenswrapper[4693]: I1125 12:26:13.672539 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qdmcl"] Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.090317 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ndgsx-config-f6qp6" event={"ID":"3ff1a2e3-b0d9-4895-b93b-e71b84ba7961","Type":"ContainerDied","Data":"4a113c5739db996294125fb07335939d64e4a7c73d78c743ddd953725d07a1a9"} Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.090366 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a113c5739db996294125fb07335939d64e4a7c73d78c743ddd953725d07a1a9" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.090366 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ndgsx-config-f6qp6" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.091610 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qdmcl" event={"ID":"e8135cf2-4e92-4e70-9c47-f5fae388c0be","Type":"ContainerStarted","Data":"336effd30cf85b5126a77a64c8b26cb86ce36376c66c6054a7caf962fba260db"} Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.398573 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481696 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-ring-data-devices\") pod \"88ff5ba0-ea04-4e77-9f16-05711082df93\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481778 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-dispersionconf\") pod \"88ff5ba0-ea04-4e77-9f16-05711082df93\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481814 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbcml\" (UniqueName: \"kubernetes.io/projected/88ff5ba0-ea04-4e77-9f16-05711082df93-kube-api-access-fbcml\") pod \"88ff5ba0-ea04-4e77-9f16-05711082df93\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481850 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-scripts\") pod \"88ff5ba0-ea04-4e77-9f16-05711082df93\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481870 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-combined-ca-bundle\") pod \"88ff5ba0-ea04-4e77-9f16-05711082df93\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481912 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88ff5ba0-ea04-4e77-9f16-05711082df93-etc-swift\") pod \"88ff5ba0-ea04-4e77-9f16-05711082df93\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481953 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-swiftconf\") pod \"88ff5ba0-ea04-4e77-9f16-05711082df93\" (UID: \"88ff5ba0-ea04-4e77-9f16-05711082df93\") " Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.481980 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ndgsx-config-f6qp6"] Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.482945 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "88ff5ba0-ea04-4e77-9f16-05711082df93" (UID: "88ff5ba0-ea04-4e77-9f16-05711082df93"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.483121 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ff5ba0-ea04-4e77-9f16-05711082df93-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "88ff5ba0-ea04-4e77-9f16-05711082df93" (UID: "88ff5ba0-ea04-4e77-9f16-05711082df93"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.486938 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ndgsx-config-f6qp6"] Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.497116 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ff5ba0-ea04-4e77-9f16-05711082df93-kube-api-access-fbcml" (OuterVolumeSpecName: "kube-api-access-fbcml") pod "88ff5ba0-ea04-4e77-9f16-05711082df93" (UID: "88ff5ba0-ea04-4e77-9f16-05711082df93"). InnerVolumeSpecName "kube-api-access-fbcml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.500182 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "88ff5ba0-ea04-4e77-9f16-05711082df93" (UID: "88ff5ba0-ea04-4e77-9f16-05711082df93"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.506281 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88ff5ba0-ea04-4e77-9f16-05711082df93" (UID: "88ff5ba0-ea04-4e77-9f16-05711082df93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.506711 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "88ff5ba0-ea04-4e77-9f16-05711082df93" (UID: "88ff5ba0-ea04-4e77-9f16-05711082df93"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.511641 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-scripts" (OuterVolumeSpecName: "scripts") pod "88ff5ba0-ea04-4e77-9f16-05711082df93" (UID: "88ff5ba0-ea04-4e77-9f16-05711082df93"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.584094 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.584137 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.584149 4693 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88ff5ba0-ea04-4e77-9f16-05711082df93-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.584168 4693 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.584179 4693 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88ff5ba0-ea04-4e77-9f16-05711082df93-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.584191 4693 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88ff5ba0-ea04-4e77-9f16-05711082df93-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.584202 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbcml\" (UniqueName: \"kubernetes.io/projected/88ff5ba0-ea04-4e77-9f16-05711082df93-kube-api-access-fbcml\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:14 crc kubenswrapper[4693]: I1125 12:26:14.824892 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" path="/var/lib/kubelet/pods/3ff1a2e3-b0d9-4895-b93b-e71b84ba7961/volumes" Nov 25 12:26:15 crc kubenswrapper[4693]: I1125 12:26:15.098947 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2kzct" event={"ID":"88ff5ba0-ea04-4e77-9f16-05711082df93","Type":"ContainerDied","Data":"49660b86c19a4db186f0439fde95ea90ddc8c195a1ec834b6d6ad4052b7c9a97"} Nov 25 12:26:15 crc kubenswrapper[4693]: I1125 12:26:15.098996 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49660b86c19a4db186f0439fde95ea90ddc8c195a1ec834b6d6ad4052b7c9a97" Nov 25 12:26:15 crc kubenswrapper[4693]: I1125 12:26:15.099290 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2kzct" Nov 25 12:26:17 crc kubenswrapper[4693]: I1125 12:26:17.121956 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:26:17 crc kubenswrapper[4693]: I1125 12:26:17.128431 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c8b28a97-55d7-41b0-aa09-55e4e132bd64-etc-swift\") pod \"swift-storage-0\" (UID: \"c8b28a97-55d7-41b0-aa09-55e4e132bd64\") " pod="openstack/swift-storage-0" Nov 25 12:26:17 crc kubenswrapper[4693]: I1125 12:26:17.287877 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 25 12:26:17 crc kubenswrapper[4693]: I1125 12:26:17.818005 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 25 12:26:18 crc kubenswrapper[4693]: I1125 12:26:18.124130 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"43cb913fb828303b3aee7932cb8449a7af26c2de52eda1146afa284eff0fc4ec"} Nov 25 12:26:18 crc kubenswrapper[4693]: I1125 12:26:18.825083 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.199843 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.239724 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-b4hnc"] Nov 25 12:26:19 crc kubenswrapper[4693]: E1125 12:26:19.240204 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" containerName="ovn-config" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.240223 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" containerName="ovn-config" Nov 25 12:26:19 crc kubenswrapper[4693]: E1125 12:26:19.240243 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ff5ba0-ea04-4e77-9f16-05711082df93" containerName="swift-ring-rebalance" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.240250 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ff5ba0-ea04-4e77-9f16-05711082df93" containerName="swift-ring-rebalance" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.240464 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ff5ba0-ea04-4e77-9f16-05711082df93" containerName="swift-ring-rebalance" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.240490 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff1a2e3-b0d9-4895-b93b-e71b84ba7961" containerName="ovn-config" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.241110 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.266088 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2aca-account-create-7xdtr"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.267213 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.274684 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.275080 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2aca-account-create-7xdtr"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.282128 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-b4hnc"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.358893 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mp8\" (UniqueName: \"kubernetes.io/projected/ca95ad70-7560-4759-9300-7e291663c116-kube-api-access-s4mp8\") pod \"cinder-db-create-b4hnc\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.358979 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca95ad70-7560-4759-9300-7e291663c116-operator-scripts\") pod \"cinder-db-create-b4hnc\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.359057 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-operator-scripts\") pod \"cinder-2aca-account-create-7xdtr\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.359103 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qslm\" (UniqueName: \"kubernetes.io/projected/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-kube-api-access-2qslm\") pod \"cinder-2aca-account-create-7xdtr\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.361310 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-tmcg9"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.368188 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tmcg9"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.368295 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.416687 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e13d-account-create-2vllw"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.417855 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.426276 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.435186 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e13d-account-create-2vllw"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.460645 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mp8\" (UniqueName: \"kubernetes.io/projected/ca95ad70-7560-4759-9300-7e291663c116-kube-api-access-s4mp8\") pod \"cinder-db-create-b4hnc\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.460728 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8s4\" (UniqueName: \"kubernetes.io/projected/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-kube-api-access-hx8s4\") pod \"neutron-e13d-account-create-2vllw\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.460791 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca95ad70-7560-4759-9300-7e291663c116-operator-scripts\") pod \"cinder-db-create-b4hnc\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.460889 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-operator-scripts\") pod \"cinder-2aca-account-create-7xdtr\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.460920 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6w7t\" (UniqueName: \"kubernetes.io/projected/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-kube-api-access-q6w7t\") pod \"barbican-db-create-tmcg9\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.460967 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qslm\" (UniqueName: \"kubernetes.io/projected/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-kube-api-access-2qslm\") pod \"cinder-2aca-account-create-7xdtr\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.460993 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-operator-scripts\") pod \"neutron-e13d-account-create-2vllw\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.461558 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-operator-scripts\") pod 
\"barbican-db-create-tmcg9\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.462048 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-operator-scripts\") pod \"cinder-2aca-account-create-7xdtr\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.462635 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca95ad70-7560-4759-9300-7e291663c116-operator-scripts\") pod \"cinder-db-create-b4hnc\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.491985 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qslm\" (UniqueName: \"kubernetes.io/projected/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-kube-api-access-2qslm\") pod \"cinder-2aca-account-create-7xdtr\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.493032 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mp8\" (UniqueName: \"kubernetes.io/projected/ca95ad70-7560-4759-9300-7e291663c116-kube-api-access-s4mp8\") pod \"cinder-db-create-b4hnc\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.513983 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7rwc8"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.515244 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.519009 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.519138 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.519185 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.519417 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6w" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.556447 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7rwc8"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.563083 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-operator-scripts\") pod \"barbican-db-create-tmcg9\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.563155 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqzg\" (UniqueName: \"kubernetes.io/projected/0fb56062-ca4e-44f8-b5a1-af139e355d6e-kube-api-access-5cqzg\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.563207 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8s4\" (UniqueName: \"kubernetes.io/projected/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-kube-api-access-hx8s4\") pod \"neutron-e13d-account-create-2vllw\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.563241 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-combined-ca-bundle\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.563281 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-config-data\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.563304 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6w7t\" (UniqueName: \"kubernetes.io/projected/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-kube-api-access-q6w7t\") pod \"barbican-db-create-tmcg9\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.563356 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-operator-scripts\") pod 
\"neutron-e13d-account-create-2vllw\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.564108 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-operator-scripts\") pod \"neutron-e13d-account-create-2vllw\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.564926 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-operator-scripts\") pod \"barbican-db-create-tmcg9\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.584151 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8s4\" (UniqueName: \"kubernetes.io/projected/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-kube-api-access-hx8s4\") pod \"neutron-e13d-account-create-2vllw\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.588586 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6w7t\" (UniqueName: \"kubernetes.io/projected/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-kube-api-access-q6w7t\") pod \"barbican-db-create-tmcg9\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.589771 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.590134 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.632845 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2wcm4"] Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.636121 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2wcm4" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.846449 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.847027 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.848986 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-config-data\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.849072 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvvhd\" (UniqueName: \"kubernetes.io/projected/8be65a28-a340-4458-afba-068603fb0ec1-kube-api-access-rvvhd\") pod \"neutron-db-create-2wcm4\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " pod="openstack/neutron-db-create-2wcm4" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.849104 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be65a28-a340-4458-afba-068603fb0ec1-operator-scripts\") pod \"neutron-db-create-2wcm4\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " pod="openstack/neutron-db-create-2wcm4" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.849155 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqzg\" (UniqueName: \"kubernetes.io/projected/0fb56062-ca4e-44f8-b5a1-af139e355d6e-kube-api-access-5cqzg\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.849220 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-combined-ca-bundle\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.852669 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-combined-ca-bundle\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.858841 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-config-data\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.882604 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqzg\" (UniqueName: \"kubernetes.io/projected/0fb56062-ca4e-44f8-b5a1-af139e355d6e-kube-api-access-5cqzg\") pod \"keystone-db-sync-7rwc8\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.951582 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvvhd\" (UniqueName: \"kubernetes.io/projected/8be65a28-a340-4458-afba-068603fb0ec1-kube-api-access-rvvhd\") pod \"neutron-db-create-2wcm4\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " pod="openstack/neutron-db-create-2wcm4" Nov 25 
12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.951899 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be65a28-a340-4458-afba-068603fb0ec1-operator-scripts\") pod \"neutron-db-create-2wcm4\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " pod="openstack/neutron-db-create-2wcm4"
Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.952995 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be65a28-a340-4458-afba-068603fb0ec1-operator-scripts\") pod \"neutron-db-create-2wcm4\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " pod="openstack/neutron-db-create-2wcm4"
Nov 25 12:26:19 crc kubenswrapper[4693]: I1125 12:26:19.953084 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7rwc8"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:19.962366 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2wcm4"]
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.024436 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvvhd\" (UniqueName: \"kubernetes.io/projected/8be65a28-a340-4458-afba-068603fb0ec1-kube-api-access-rvvhd\") pod \"neutron-db-create-2wcm4\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " pod="openstack/neutron-db-create-2wcm4"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.142564 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6aa6-account-create-8z45k"]
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.148850 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.153131 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6aa6-account-create-8z45k"]
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.154638 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.155549 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-operator-scripts\") pod \"barbican-6aa6-account-create-8z45k\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.155618 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bszj4\" (UniqueName: \"kubernetes.io/projected/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-kube-api-access-bszj4\") pod \"barbican-6aa6-account-create-8z45k\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.256387 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-operator-scripts\") pod \"barbican-6aa6-account-create-8z45k\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.256471 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bszj4\" (UniqueName: \"kubernetes.io/projected/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-kube-api-access-bszj4\") pod \"barbican-6aa6-account-create-8z45k\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.257197 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-operator-scripts\") pod \"barbican-6aa6-account-create-8z45k\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.263353 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2wcm4"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.273838 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bszj4\" (UniqueName: \"kubernetes.io/projected/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-kube-api-access-bszj4\") pod \"barbican-6aa6-account-create-8z45k\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:20 crc kubenswrapper[4693]: I1125 12:26:20.466860 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6aa6-account-create-8z45k"
Nov 25 12:26:27 crc kubenswrapper[4693]: E1125 12:26:27.947328 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:26bd7b0bd6070856aefef6fe754c547d55c056396ea30d879d34c2d49b5a1d29"
Nov 25 12:26:27 crc kubenswrapper[4693]: E1125 12:26:27.947919 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:26bd7b0bd6070856aefef6fe754c547d55c056396ea30d879d34c2d49b5a1d29,Command:[/bin/bash],Args:[-c
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vfxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-qdmcl_openstack(e8135cf2-4e92-4e70-9c47-f5fae388c0be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:26:27 crc kubenswrapper[4693]: E1125 12:26:27.949107 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-qdmcl" podUID="e8135cf2-4e92-4e70-9c47-f5fae388c0be" Nov 25 12:26:28 crc kubenswrapper[4693]: E1125 12:26:28.255950 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:26bd7b0bd6070856aefef6fe754c547d55c056396ea30d879d34c2d49b5a1d29\\\"\"" pod="openstack/glance-db-sync-qdmcl" podUID="e8135cf2-4e92-4e70-9c47-f5fae388c0be" Nov 25 12:26:28 crc kubenswrapper[4693]: I1125 12:26:28.509555 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7rwc8"] Nov 25 12:26:28 crc kubenswrapper[4693]: I1125 12:26:28.629309 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-b4hnc"] Nov 25 12:26:28 crc kubenswrapper[4693]: I1125 12:26:28.638598 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tmcg9"] Nov 25 12:26:28 crc kubenswrapper[4693]: I1125 12:26:28.647858 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6aa6-account-create-8z45k"] Nov 25 12:26:28 crc kubenswrapper[4693]: I1125 
12:26:28.654201 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2wcm4"] Nov 25 12:26:28 crc kubenswrapper[4693]: W1125 12:26:28.761979 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdacf3ab_72e9_4ed4_95ca_a03db2c1c5a1.slice/crio-06a1ff465151378138bb841b396808fe52dec46d9cafb23da234301d8c63f2dc WatchSource:0}: Error finding container 06a1ff465151378138bb841b396808fe52dec46d9cafb23da234301d8c63f2dc: Status 404 returned error can't find the container with id 06a1ff465151378138bb841b396808fe52dec46d9cafb23da234301d8c63f2dc Nov 25 12:26:28 crc kubenswrapper[4693]: W1125 12:26:28.766201 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8be65a28_a340_4458_afba_068603fb0ec1.slice/crio-443be996b5d588d56e6aff9519b099593ef086fd6dab73c3fccb93719d45ce2c WatchSource:0}: Error finding container 443be996b5d588d56e6aff9519b099593ef086fd6dab73c3fccb93719d45ce2c: Status 404 returned error can't find the container with id 443be996b5d588d56e6aff9519b099593ef086fd6dab73c3fccb93719d45ce2c Nov 25 12:26:28 crc kubenswrapper[4693]: W1125 12:26:28.768328 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d51f839_d354_4e68_91b1_63cdb3f3c3fa.slice/crio-a1952de01167734a2d673287cea3f129a22b0b83cfe63b77433e06413636a94e WatchSource:0}: Error finding container a1952de01167734a2d673287cea3f129a22b0b83cfe63b77433e06413636a94e: Status 404 returned error can't find the container with id a1952de01167734a2d673287cea3f129a22b0b83cfe63b77433e06413636a94e Nov 25 12:26:28 crc kubenswrapper[4693]: W1125 12:26:28.774148 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fb56062_ca4e_44f8_b5a1_af139e355d6e.slice/crio-e2da903d62e8ecfab71da58157f4c061c225bc78a5f59a8da738d0cfcd6d5c69 WatchSource:0}: Error finding container e2da903d62e8ecfab71da58157f4c061c225bc78a5f59a8da738d0cfcd6d5c69: Status 404 returned error can't find the container with id e2da903d62e8ecfab71da58157f4c061c225bc78a5f59a8da738d0cfcd6d5c69 Nov 25 12:26:28 crc kubenswrapper[4693]: W1125 12:26:28.776570 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca95ad70_7560_4759_9300_7e291663c116.slice/crio-f3cb861c4684edf4003ff416c981938748eb4ca11c3cfc9688f3a2c7265b461f WatchSource:0}: Error finding container f3cb861c4684edf4003ff416c981938748eb4ca11c3cfc9688f3a2c7265b461f: Status 404 returned error can't find the container with id f3cb861c4684edf4003ff416c981938748eb4ca11c3cfc9688f3a2c7265b461f Nov 25 12:26:28 crc kubenswrapper[4693]: I1125 12:26:28.788850 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e13d-account-create-2vllw"] Nov 25 12:26:28 crc kubenswrapper[4693]: W1125 12:26:28.806277 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbce6219d_3f7d_4731_a330_13d3dd9cb3c7.slice/crio-9198da14e6a9bf63c848cb9d4badf796847173978597f42b8396696ed81a0cb3 WatchSource:0}: Error finding container 9198da14e6a9bf63c848cb9d4badf796847173978597f42b8396696ed81a0cb3: Status 404 returned error can't find the container with id 9198da14e6a9bf63c848cb9d4badf796847173978597f42b8396696ed81a0cb3 Nov 25 12:26:28 crc 
kubenswrapper[4693]: I1125 12:26:28.846097 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2aca-account-create-7xdtr"] Nov 25 12:26:28 crc kubenswrapper[4693]: W1125 12:26:28.875003 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd575f9b6_439e_4e0d_b056_3dbcfa35c81d.slice/crio-8041318ffaa592c949f3bebaf1c6ae663f18e977aa192810f1192213efdce621 WatchSource:0}: Error finding container 8041318ffaa592c949f3bebaf1c6ae663f18e977aa192810f1192213efdce621: Status 404 returned error can't find the container with id 8041318ffaa592c949f3bebaf1c6ae663f18e977aa192810f1192213efdce621 Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.236348 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2aca-account-create-7xdtr" event={"ID":"d575f9b6-439e-4e0d-b056-3dbcfa35c81d","Type":"ContainerStarted","Data":"1862438b965a6043f320ee9cc1d6fc04dedd0096a23736d0861143ad8e795db7"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.236418 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2aca-account-create-7xdtr" event={"ID":"d575f9b6-439e-4e0d-b056-3dbcfa35c81d","Type":"ContainerStarted","Data":"8041318ffaa592c949f3bebaf1c6ae663f18e977aa192810f1192213efdce621"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.238747 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e13d-account-create-2vllw" event={"ID":"bce6219d-3f7d-4731-a330-13d3dd9cb3c7","Type":"ContainerStarted","Data":"73d4401a25153a0144e65825b7545fd97acaf2e971e5530eb2ca3b649d96b144"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.238791 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e13d-account-create-2vllw" event={"ID":"bce6219d-3f7d-4731-a330-13d3dd9cb3c7","Type":"ContainerStarted","Data":"9198da14e6a9bf63c848cb9d4badf796847173978597f42b8396696ed81a0cb3"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.242691 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tmcg9" event={"ID":"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1","Type":"ContainerStarted","Data":"0365948fea35c09bc61b609f21b4bd4fa4b216b01ad47b8bde568769ca757804"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.242758 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tmcg9" event={"ID":"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1","Type":"ContainerStarted","Data":"06a1ff465151378138bb841b396808fe52dec46d9cafb23da234301d8c63f2dc"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.248076 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2wcm4" event={"ID":"8be65a28-a340-4458-afba-068603fb0ec1","Type":"ContainerStarted","Data":"62c8ebe60a6f8e52b3547b1b87ec95eec93a20dd3b170b2c63d94d7be81c3468"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.248123 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2wcm4" event={"ID":"8be65a28-a340-4458-afba-068603fb0ec1","Type":"ContainerStarted","Data":"443be996b5d588d56e6aff9519b099593ef086fd6dab73c3fccb93719d45ce2c"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.257414 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6aa6-account-create-8z45k" event={"ID":"6d51f839-d354-4e68-91b1-63cdb3f3c3fa","Type":"ContainerStarted","Data":"133c9d55f230006c7069a4e502a0f5c448634ae0174e3da97df578cb288f2abd"} Nov 25 12:26:29 crc 
kubenswrapper[4693]: I1125 12:26:29.257476 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6aa6-account-create-8z45k" event={"ID":"6d51f839-d354-4e68-91b1-63cdb3f3c3fa","Type":"ContainerStarted","Data":"a1952de01167734a2d673287cea3f129a22b0b83cfe63b77433e06413636a94e"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.259580 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b4hnc" event={"ID":"ca95ad70-7560-4759-9300-7e291663c116","Type":"ContainerStarted","Data":"fa6bbeed6e3f3a7eaba951ed5b3c3ce02b321db589105d00b9ff7699b8dc6558"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.259637 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b4hnc" event={"ID":"ca95ad70-7560-4759-9300-7e291663c116","Type":"ContainerStarted","Data":"f3cb861c4684edf4003ff416c981938748eb4ca11c3cfc9688f3a2c7265b461f"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.265139 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2aca-account-create-7xdtr" podStartSLOduration=10.265102982 podStartE2EDuration="10.265102982s" podCreationTimestamp="2025-11-25 12:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:29.25342972 +0000 UTC m=+1109.171515121" watchObservedRunningTime="2025-11-25 12:26:29.265102982 +0000 UTC m=+1109.183188363" Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.272136 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-e13d-account-create-2vllw" podStartSLOduration=10.272119681 podStartE2EDuration="10.272119681s" podCreationTimestamp="2025-11-25 12:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:29.270128984 +0000 UTC m=+1109.188214375" watchObservedRunningTime="2025-11-25 12:26:29.272119681 +0000 UTC m=+1109.190205062" Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.283638 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rwc8" event={"ID":"0fb56062-ca4e-44f8-b5a1-af139e355d6e","Type":"ContainerStarted","Data":"e2da903d62e8ecfab71da58157f4c061c225bc78a5f59a8da738d0cfcd6d5c69"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.292960 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"baff0b391125de82811b99029ab8e8654c7419e758599228f1e3b4fe01b1f560"} Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.301911 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-tmcg9" podStartSLOduration=10.301891345 podStartE2EDuration="10.301891345s" podCreationTimestamp="2025-11-25 12:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:29.299136637 +0000 UTC m=+1109.217222018" watchObservedRunningTime="2025-11-25 12:26:29.301891345 +0000 UTC m=+1109.219976726" Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.331913 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-2wcm4" podStartSLOduration=10.331890016 podStartE2EDuration="10.331890016s" podCreationTimestamp="2025-11-25 12:26:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:29.322572912 +0000 UTC m=+1109.240658283" watchObservedRunningTime="2025-11-25 12:26:29.331890016 +0000 UTC m=+1109.249975397" Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.347098 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-6aa6-account-create-8z45k" podStartSLOduration=9.347079957 podStartE2EDuration="9.347079957s" podCreationTimestamp="2025-11-25 12:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:29.339542344 +0000 UTC m=+1109.257627745" watchObservedRunningTime="2025-11-25 12:26:29.347079957 +0000 UTC m=+1109.265165338" Nov 25 12:26:29 crc kubenswrapper[4693]: I1125 12:26:29.360536 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-b4hnc" podStartSLOduration=10.360516798999999 podStartE2EDuration="10.360516799s" podCreationTimestamp="2025-11-25 12:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:29.359064198 +0000 UTC m=+1109.277149569" watchObservedRunningTime="2025-11-25 12:26:29.360516799 +0000 UTC m=+1109.278602180" Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.308925 4693 generic.go:334] "Generic (PLEG): container finished" podID="8be65a28-a340-4458-afba-068603fb0ec1" containerID="62c8ebe60a6f8e52b3547b1b87ec95eec93a20dd3b170b2c63d94d7be81c3468" exitCode=0 Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.309015 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2wcm4" event={"ID":"8be65a28-a340-4458-afba-068603fb0ec1","Type":"ContainerDied","Data":"62c8ebe60a6f8e52b3547b1b87ec95eec93a20dd3b170b2c63d94d7be81c3468"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.314178 4693 generic.go:334] "Generic (PLEG): container finished" podID="6d51f839-d354-4e68-91b1-63cdb3f3c3fa" containerID="133c9d55f230006c7069a4e502a0f5c448634ae0174e3da97df578cb288f2abd" exitCode=0 Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.314319 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6aa6-account-create-8z45k" event={"ID":"6d51f839-d354-4e68-91b1-63cdb3f3c3fa","Type":"ContainerDied","Data":"133c9d55f230006c7069a4e502a0f5c448634ae0174e3da97df578cb288f2abd"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.319310 4693 generic.go:334] "Generic (PLEG): container finished" podID="ca95ad70-7560-4759-9300-7e291663c116" containerID="fa6bbeed6e3f3a7eaba951ed5b3c3ce02b321db589105d00b9ff7699b8dc6558" exitCode=0 Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.319413 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b4hnc" event={"ID":"ca95ad70-7560-4759-9300-7e291663c116","Type":"ContainerDied","Data":"fa6bbeed6e3f3a7eaba951ed5b3c3ce02b321db589105d00b9ff7699b8dc6558"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.324523 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"dc670489e11fe45d9cee14e9de122c299aed32bd9bf79577f9a1b976fc74ade9"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.324584 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"418b07a577a01b1fd893227a74b3b0d43c6df519ab02fa60e0bb123ebded13c6"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.324598 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"907758ed6bb12af24167bc6ab7544e8178f4033d1a9c76dff1b46f4dee1e7ae6"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.341162 4693 generic.go:334] "Generic (PLEG): container finished" podID="d575f9b6-439e-4e0d-b056-3dbcfa35c81d" containerID="1862438b965a6043f320ee9cc1d6fc04dedd0096a23736d0861143ad8e795db7" exitCode=0 Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.341236 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2aca-account-create-7xdtr" event={"ID":"d575f9b6-439e-4e0d-b056-3dbcfa35c81d","Type":"ContainerDied","Data":"1862438b965a6043f320ee9cc1d6fc04dedd0096a23736d0861143ad8e795db7"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.344133 4693 generic.go:334] "Generic (PLEG): container finished" podID="bce6219d-3f7d-4731-a330-13d3dd9cb3c7" containerID="73d4401a25153a0144e65825b7545fd97acaf2e971e5530eb2ca3b649d96b144" exitCode=0 Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.344207 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e13d-account-create-2vllw" event={"ID":"bce6219d-3f7d-4731-a330-13d3dd9cb3c7","Type":"ContainerDied","Data":"73d4401a25153a0144e65825b7545fd97acaf2e971e5530eb2ca3b649d96b144"} Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.357741 4693 generic.go:334] "Generic (PLEG): container finished" podID="cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1" containerID="0365948fea35c09bc61b609f21b4bd4fa4b216b01ad47b8bde568769ca757804" exitCode=0 Nov 25 12:26:30 crc kubenswrapper[4693]: I1125 12:26:30.357864 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tmcg9" event={"ID":"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1","Type":"ContainerDied","Data":"0365948fea35c09bc61b609f21b4bd4fa4b216b01ad47b8bde568769ca757804"} Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.234753 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.375623 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6w7t\" (UniqueName: \"kubernetes.io/projected/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-kube-api-access-q6w7t\") pod \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.375707 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-operator-scripts\") pod \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\" (UID: \"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.376572 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1" (UID: "cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.384724 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-kube-api-access-q6w7t" (OuterVolumeSpecName: "kube-api-access-q6w7t") pod "cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1" (UID: "cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1"). InnerVolumeSpecName "kube-api-access-q6w7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.411268 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tmcg9" event={"ID":"cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1","Type":"ContainerDied","Data":"06a1ff465151378138bb841b396808fe52dec46d9cafb23da234301d8c63f2dc"} Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.411563 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a1ff465151378138bb841b396808fe52dec46d9cafb23da234301d8c63f2dc" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.411318 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-tmcg9" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.480199 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6w7t\" (UniqueName: \"kubernetes.io/projected/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-kube-api-access-q6w7t\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.480222 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.538009 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.550588 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.571697 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2wcm4" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.580640 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be65a28-a340-4458-afba-068603fb0ec1-operator-scripts\") pod \"8be65a28-a340-4458-afba-068603fb0ec1\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.580691 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-operator-scripts\") pod \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.580732 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8s4\" (UniqueName: \"kubernetes.io/projected/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-kube-api-access-hx8s4\") pod \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\" (UID: \"bce6219d-3f7d-4731-a330-13d3dd9cb3c7\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.581240 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvvhd\" (UniqueName: \"kubernetes.io/projected/8be65a28-a340-4458-afba-068603fb0ec1-kube-api-access-rvvhd\") pod \"8be65a28-a340-4458-afba-068603fb0ec1\" (UID: \"8be65a28-a340-4458-afba-068603fb0ec1\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.581271 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-operator-scripts\") pod \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.581294 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qslm\" (UniqueName: \"kubernetes.io/projected/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-kube-api-access-2qslm\") pod \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\" (UID: \"d575f9b6-439e-4e0d-b056-3dbcfa35c81d\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.581933 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be65a28-a340-4458-afba-068603fb0ec1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8be65a28-a340-4458-afba-068603fb0ec1" (UID: "8be65a28-a340-4458-afba-068603fb0ec1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.582903 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d575f9b6-439e-4e0d-b056-3dbcfa35c81d" (UID: "d575f9b6-439e-4e0d-b056-3dbcfa35c81d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.582979 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bce6219d-3f7d-4731-a330-13d3dd9cb3c7" (UID: "bce6219d-3f7d-4731-a330-13d3dd9cb3c7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.583816 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.583947 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8be65a28-a340-4458-afba-068603fb0ec1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.584061 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.585441 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-kube-api-access-hx8s4" (OuterVolumeSpecName: "kube-api-access-hx8s4") pod "bce6219d-3f7d-4731-a330-13d3dd9cb3c7" (UID: "bce6219d-3f7d-4731-a330-13d3dd9cb3c7"). InnerVolumeSpecName "kube-api-access-hx8s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.596215 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be65a28-a340-4458-afba-068603fb0ec1-kube-api-access-rvvhd" (OuterVolumeSpecName: "kube-api-access-rvvhd") pod "8be65a28-a340-4458-afba-068603fb0ec1" (UID: "8be65a28-a340-4458-afba-068603fb0ec1"). InnerVolumeSpecName "kube-api-access-rvvhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.596685 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-kube-api-access-2qslm" (OuterVolumeSpecName: "kube-api-access-2qslm") pod "d575f9b6-439e-4e0d-b056-3dbcfa35c81d" (UID: "d575f9b6-439e-4e0d-b056-3dbcfa35c81d"). InnerVolumeSpecName "kube-api-access-2qslm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.617755 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.659043 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6aa6-account-create-8z45k" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.685078 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bszj4\" (UniqueName: \"kubernetes.io/projected/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-kube-api-access-bszj4\") pod \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.685120 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4mp8\" (UniqueName: \"kubernetes.io/projected/ca95ad70-7560-4759-9300-7e291663c116-kube-api-access-s4mp8\") pod \"ca95ad70-7560-4759-9300-7e291663c116\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.685202 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-operator-scripts\") pod \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\" (UID: \"6d51f839-d354-4e68-91b1-63cdb3f3c3fa\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.685328 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca95ad70-7560-4759-9300-7e291663c116-operator-scripts\") pod \"ca95ad70-7560-4759-9300-7e291663c116\" (UID: \"ca95ad70-7560-4759-9300-7e291663c116\") " Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.685614 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8s4\" (UniqueName: \"kubernetes.io/projected/bce6219d-3f7d-4731-a330-13d3dd9cb3c7-kube-api-access-hx8s4\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.685628 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvvhd\" (UniqueName: \"kubernetes.io/projected/8be65a28-a340-4458-afba-068603fb0ec1-kube-api-access-rvvhd\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.685638 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qslm\" (UniqueName: \"kubernetes.io/projected/d575f9b6-439e-4e0d-b056-3dbcfa35c81d-kube-api-access-2qslm\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.686166 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca95ad70-7560-4759-9300-7e291663c116-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca95ad70-7560-4759-9300-7e291663c116" (UID: "ca95ad70-7560-4759-9300-7e291663c116"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.686637 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d51f839-d354-4e68-91b1-63cdb3f3c3fa" (UID: "6d51f839-d354-4e68-91b1-63cdb3f3c3fa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.707269 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca95ad70-7560-4759-9300-7e291663c116-kube-api-access-s4mp8" (OuterVolumeSpecName: "kube-api-access-s4mp8") pod "ca95ad70-7560-4759-9300-7e291663c116" (UID: "ca95ad70-7560-4759-9300-7e291663c116"). InnerVolumeSpecName "kube-api-access-s4mp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.707342 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-kube-api-access-bszj4" (OuterVolumeSpecName: "kube-api-access-bszj4") pod "6d51f839-d354-4e68-91b1-63cdb3f3c3fa" (UID: "6d51f839-d354-4e68-91b1-63cdb3f3c3fa"). InnerVolumeSpecName "kube-api-access-bszj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.787044 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.787347 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca95ad70-7560-4759-9300-7e291663c116-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.787502 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bszj4\" (UniqueName: \"kubernetes.io/projected/6d51f839-d354-4e68-91b1-63cdb3f3c3fa-kube-api-access-bszj4\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:36 crc kubenswrapper[4693]: I1125 12:26:36.787598 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4mp8\" (UniqueName: \"kubernetes.io/projected/ca95ad70-7560-4759-9300-7e291663c116-kube-api-access-s4mp8\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.419243 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2aca-account-create-7xdtr" event={"ID":"d575f9b6-439e-4e0d-b056-3dbcfa35c81d","Type":"ContainerDied","Data":"8041318ffaa592c949f3bebaf1c6ae663f18e977aa192810f1192213efdce621"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.419570 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8041318ffaa592c949f3bebaf1c6ae663f18e977aa192810f1192213efdce621" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.419265 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2aca-account-create-7xdtr" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.421333 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e13d-account-create-2vllw" event={"ID":"bce6219d-3f7d-4731-a330-13d3dd9cb3c7","Type":"ContainerDied","Data":"9198da14e6a9bf63c848cb9d4badf796847173978597f42b8396696ed81a0cb3"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.421392 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9198da14e6a9bf63c848cb9d4badf796847173978597f42b8396696ed81a0cb3" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.421464 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e13d-account-create-2vllw" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.423424 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2wcm4" event={"ID":"8be65a28-a340-4458-afba-068603fb0ec1","Type":"ContainerDied","Data":"443be996b5d588d56e6aff9519b099593ef086fd6dab73c3fccb93719d45ce2c"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.423458 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443be996b5d588d56e6aff9519b099593ef086fd6dab73c3fccb93719d45ce2c" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.423522 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2wcm4" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.425430 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6aa6-account-create-8z45k" event={"ID":"6d51f839-d354-4e68-91b1-63cdb3f3c3fa","Type":"ContainerDied","Data":"a1952de01167734a2d673287cea3f129a22b0b83cfe63b77433e06413636a94e"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.425450 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1952de01167734a2d673287cea3f129a22b0b83cfe63b77433e06413636a94e" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.425478 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6aa6-account-create-8z45k" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.428282 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-b4hnc" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.428273 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-b4hnc" event={"ID":"ca95ad70-7560-4759-9300-7e291663c116","Type":"ContainerDied","Data":"f3cb861c4684edf4003ff416c981938748eb4ca11c3cfc9688f3a2c7265b461f"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.428405 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3cb861c4684edf4003ff416c981938748eb4ca11c3cfc9688f3a2c7265b461f" Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.430184 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rwc8" event={"ID":"0fb56062-ca4e-44f8-b5a1-af139e355d6e","Type":"ContainerStarted","Data":"a1a68526476f980a8923d3ed8ce10a849f044ad1b93f604a4317d2a03b195116"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.440905 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"ae0e3bd3ddad9dadc5c9981945f41b1e13deadf61bf168d38f2e5f6769eeecbd"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.440948 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"f60256b3edec9d4df163badb92bf375a098e32d7fe8da38741764131b7ceedf9"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.440958 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"23e70c23cd067df6cc57c4131c08e899318ca2a5c6abf6911b6ab5993f66c804"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.440969 4693 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"c7e22091d8f87429d2c545799cc65458948742dc3a8300351a1be36f04426101"} Nov 25 12:26:37 crc kubenswrapper[4693]: I1125 12:26:37.450533 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7rwc8" podStartSLOduration=10.773251249 podStartE2EDuration="18.450509148s" podCreationTimestamp="2025-11-25 12:26:19 +0000 UTC" firstStartedPulling="2025-11-25 12:26:28.800203311 +0000 UTC m=+1108.718288692" lastFinishedPulling="2025-11-25 12:26:36.47746121 +0000 UTC m=+1116.395546591" observedRunningTime="2025-11-25 12:26:37.443746616 +0000 UTC m=+1117.361831997" watchObservedRunningTime="2025-11-25 12:26:37.450509148 +0000 UTC m=+1117.368594529" Nov 25 12:26:39 crc kubenswrapper[4693]: I1125 12:26:39.472142 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"b725123cb8b5d3ecd2edbf8d15362507762afe7e0d3084504ec7cd1aaf8f00b1"} Nov 25 12:26:39 crc kubenswrapper[4693]: I1125 12:26:39.472699 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"de7c93bf0c3b42c0455504f3dc022f74bd1d49dfca9571857aea1b4f2e5514a4"} Nov 25 12:26:39 crc kubenswrapper[4693]: I1125 12:26:39.472710 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"5f98692cd14e05ad7ecf5b3578b6d820254c47997676e3f35e37b6443a0686a7"} Nov 25 12:26:39 crc kubenswrapper[4693]: I1125 12:26:39.472719 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"932ac9ffdfc900eb9d44a3bfa6d90e72043d944b0f1c308da1f2354488ed493b"} Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.481670 4693 generic.go:334] "Generic (PLEG): container finished" podID="0fb56062-ca4e-44f8-b5a1-af139e355d6e" containerID="a1a68526476f980a8923d3ed8ce10a849f044ad1b93f604a4317d2a03b195116" exitCode=0 Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.481752 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rwc8" event={"ID":"0fb56062-ca4e-44f8-b5a1-af139e355d6e","Type":"ContainerDied","Data":"a1a68526476f980a8923d3ed8ce10a849f044ad1b93f604a4317d2a03b195116"} Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.492532 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"daaa54bfa73bd8ddf187780da9415bf9c0d5aa113048a671652c8ae5f1e039d1"} Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.492803 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"c795321a61100b8891daf77072aef7243958b0667d1e6e18fd2065992dea550a"} Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.492875 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c8b28a97-55d7-41b0-aa09-55e4e132bd64","Type":"ContainerStarted","Data":"fe2de0de31ab5aed7d02de3a26eabef6cd4e3d3d6873a81391dfe8b9cd45f971"} Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.535306 4693 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.951116913999996 podStartE2EDuration="56.535285203s" podCreationTimestamp="2025-11-25 12:25:44 +0000 UTC" firstStartedPulling="2025-11-25 12:26:17.828944901 +0000 UTC m=+1097.747030282" lastFinishedPulling="2025-11-25 12:26:38.41311319 +0000 UTC m=+1118.331198571" observedRunningTime="2025-11-25 12:26:40.529800918 +0000 UTC m=+1120.447886309" watchObservedRunningTime="2025-11-25 12:26:40.535285203 +0000 UTC m=+1120.453370584" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.795290 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86c887b9fc-r66cj"] Nov 25 12:26:40 crc kubenswrapper[4693]: E1125 12:26:40.795887 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d575f9b6-439e-4e0d-b056-3dbcfa35c81d" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.795905 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d575f9b6-439e-4e0d-b056-3dbcfa35c81d" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: E1125 12:26:40.795917 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca95ad70-7560-4759-9300-7e291663c116" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.795924 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca95ad70-7560-4759-9300-7e291663c116" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: E1125 12:26:40.795940 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce6219d-3f7d-4731-a330-13d3dd9cb3c7" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.795947 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce6219d-3f7d-4731-a330-13d3dd9cb3c7" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: E1125 12:26:40.795958 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d51f839-d354-4e68-91b1-63cdb3f3c3fa" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.795966 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d51f839-d354-4e68-91b1-63cdb3f3c3fa" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: E1125 12:26:40.795989 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.795999 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: E1125 12:26:40.796013 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be65a28-a340-4458-afba-068603fb0ec1" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.796019 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be65a28-a340-4458-afba-068603fb0ec1" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.796177 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.796196 4693 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ca95ad70-7560-4759-9300-7e291663c116" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.796211 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d575f9b6-439e-4e0d-b056-3dbcfa35c81d" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.796220 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d51f839-d354-4e68-91b1-63cdb3f3c3fa" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.796229 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce6219d-3f7d-4731-a330-13d3dd9cb3c7" containerName="mariadb-account-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.796241 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be65a28-a340-4458-afba-068603fb0ec1" containerName="mariadb-database-create" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.797138 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.800699 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.809287 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c887b9fc-r66cj"] Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.951337 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-config\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.951454 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-swift-storage-0\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.951521 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-nb\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.951694 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntqdc\" (UniqueName: \"kubernetes.io/projected/f1a4c06a-2f49-4d49-9bd1-387e002c7922-kube-api-access-ntqdc\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.951725 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-svc\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:40 crc kubenswrapper[4693]: I1125 12:26:40.951747 4693 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-sb\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.053424 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntqdc\" (UniqueName: \"kubernetes.io/projected/f1a4c06a-2f49-4d49-9bd1-387e002c7922-kube-api-access-ntqdc\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.053477 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-svc\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.053498 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-sb\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.053545 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-config\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.053588 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-swift-storage-0\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.053629 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-nb\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.054490 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-nb\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.055182 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-config\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.055543 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-swift-storage-0\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.055639 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-sb\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.055743 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-svc\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.079584 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntqdc\" (UniqueName: \"kubernetes.io/projected/f1a4c06a-2f49-4d49-9bd1-387e002c7922-kube-api-access-ntqdc\") pod \"dnsmasq-dns-86c887b9fc-r66cj\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.114367 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.535992 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86c887b9fc-r66cj"] Nov 25 12:26:41 crc kubenswrapper[4693]: W1125 12:26:41.543071 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1a4c06a_2f49_4d49_9bd1_387e002c7922.slice/crio-fce87573985b178b79227f32b13f09f11e182f2b1d81c774109023eb4c20f510 WatchSource:0}: Error finding container fce87573985b178b79227f32b13f09f11e182f2b1d81c774109023eb4c20f510: Status 404 returned error can't find the container with id fce87573985b178b79227f32b13f09f11e182f2b1d81c774109023eb4c20f510 Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.842758 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.972407 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-config-data\") pod \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.972476 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cqzg\" (UniqueName: \"kubernetes.io/projected/0fb56062-ca4e-44f8-b5a1-af139e355d6e-kube-api-access-5cqzg\") pod \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.972573 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-combined-ca-bundle\") pod \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\" (UID: \"0fb56062-ca4e-44f8-b5a1-af139e355d6e\") " Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.977727 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb56062-ca4e-44f8-b5a1-af139e355d6e-kube-api-access-5cqzg" (OuterVolumeSpecName: "kube-api-access-5cqzg") pod "0fb56062-ca4e-44f8-b5a1-af139e355d6e" (UID: "0fb56062-ca4e-44f8-b5a1-af139e355d6e"). InnerVolumeSpecName "kube-api-access-5cqzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:41 crc kubenswrapper[4693]: I1125 12:26:41.996316 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fb56062-ca4e-44f8-b5a1-af139e355d6e" (UID: "0fb56062-ca4e-44f8-b5a1-af139e355d6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.031051 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-config-data" (OuterVolumeSpecName: "config-data") pod "0fb56062-ca4e-44f8-b5a1-af139e355d6e" (UID: "0fb56062-ca4e-44f8-b5a1-af139e355d6e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.079607 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.079664 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cqzg\" (UniqueName: \"kubernetes.io/projected/0fb56062-ca4e-44f8-b5a1-af139e355d6e-kube-api-access-5cqzg\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.079678 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fb56062-ca4e-44f8-b5a1-af139e355d6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.525617 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7rwc8" event={"ID":"0fb56062-ca4e-44f8-b5a1-af139e355d6e","Type":"ContainerDied","Data":"e2da903d62e8ecfab71da58157f4c061c225bc78a5f59a8da738d0cfcd6d5c69"} Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.525668 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2da903d62e8ecfab71da58157f4c061c225bc78a5f59a8da738d0cfcd6d5c69" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.525743 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7rwc8" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.530521 4693 generic.go:334] "Generic (PLEG): container finished" podID="f1a4c06a-2f49-4d49-9bd1-387e002c7922" containerID="63f5ffa878b7c972ddc0f1dc01cd81f6de968d64fc3de8a8cc4e2d8504ed576c" exitCode=0 Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.530563 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" event={"ID":"f1a4c06a-2f49-4d49-9bd1-387e002c7922","Type":"ContainerDied","Data":"63f5ffa878b7c972ddc0f1dc01cd81f6de968d64fc3de8a8cc4e2d8504ed576c"} Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.530590 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" event={"ID":"f1a4c06a-2f49-4d49-9bd1-387e002c7922","Type":"ContainerStarted","Data":"fce87573985b178b79227f32b13f09f11e182f2b1d81c774109023eb4c20f510"} Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.777126 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ksrd6"] Nov 25 12:26:42 crc kubenswrapper[4693]: E1125 12:26:42.777754 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb56062-ca4e-44f8-b5a1-af139e355d6e" containerName="keystone-db-sync" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.777785 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb56062-ca4e-44f8-b5a1-af139e355d6e" containerName="keystone-db-sync" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.777983 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb56062-ca4e-44f8-b5a1-af139e355d6e" containerName="keystone-db-sync" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.778833 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.788877 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86c887b9fc-r66cj"] Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.790781 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.791006 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.791252 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6w" Nov 25 12:26:42 crc kubenswrapper[4693]: E1125 12:26:42.792498 4693 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Nov 25 12:26:42 crc kubenswrapper[4693]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/f1a4c06a-2f49-4d49-9bd1-387e002c7922/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 25 12:26:42 crc kubenswrapper[4693]: > podSandboxID="fce87573985b178b79227f32b13f09f11e182f2b1d81c774109023eb4c20f510" Nov 25 12:26:42 crc kubenswrapper[4693]: E1125 12:26:42.792698 4693 kuberuntime_manager.go:1274] "Unhandled Error" err=< Nov 25 12:26:42 crc kubenswrapper[4693]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:18f8463fe46fe6081d5682009e92bbcb3df33282b83b0a2857abaece795cf1ba,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n97h57bh654h659h5b6hbfhc4h689h565h578h56ch8dh8bh67fhf7h5f8hc7h5d4h5d5h5f7h687h5cbh5c5h5d8h68fh669h588h59bh5c6h674h5c8h5d7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ntqdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86c887b9fc-r66cj_openstack(f1a4c06a-2f49-4d49-9bd1-387e002c7922): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/f1a4c06a-2f49-4d49-9bd1-387e002c7922/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Nov 25 12:26:42 crc kubenswrapper[4693]: > logger="UnhandledError" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.792707 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.793054 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 12:26:42 crc kubenswrapper[4693]: E1125 12:26:42.793872 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/f1a4c06a-2f49-4d49-9bd1-387e002c7922/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" podUID="f1a4c06a-2f49-4d49-9bd1-387e002c7922" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.809077 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ksrd6"] Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.839981 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8c7bdb785-9vmgv"] Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.847865 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.861027 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8c7bdb785-9vmgv"] Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.895429 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-credential-keys\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.895550 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-scripts\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.895616 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjnm\" (UniqueName: \"kubernetes.io/projected/353ca052-e5b7-4165-a3b9-28906aeb9ecf-kube-api-access-7bjnm\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.895691 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-combined-ca-bundle\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.895786 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-config-data\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.895826 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-fernet-keys\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.957672 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5794c66b9f-22ws5"] Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.963562 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.980529 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.980747 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.981524 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.981754 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-vcjwk" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.988190 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5794c66b9f-22ws5"] Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.997918 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-fernet-keys\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.997988 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-swift-storage-0\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998018 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-nb\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-svc\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998068 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-credential-keys\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998115 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-scripts\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998216 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjnm\" (UniqueName: \"kubernetes.io/projected/353ca052-e5b7-4165-a3b9-28906aeb9ecf-kube-api-access-7bjnm\") pod \"keystone-bootstrap-ksrd6\" (UID: 
\"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998265 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-sb\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998298 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4b97\" (UniqueName: \"kubernetes.io/projected/d08b2e9e-5789-4580-82fc-47037f4995b1-kube-api-access-n4b97\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.998348 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-config\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.999215 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-combined-ca-bundle\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:42 crc kubenswrapper[4693]: I1125 12:26:42.999268 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-config-data\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.004606 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-combined-ca-bundle\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.004856 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-scripts\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.008767 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-config-data\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.013425 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-credential-keys\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc 
kubenswrapper[4693]: I1125 12:26:43.013624 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-fernet-keys\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.018453 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-kqn9c"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.019603 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.022718 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.023062 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.025070 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6pxfk" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.042233 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjnm\" (UniqueName: \"kubernetes.io/projected/353ca052-e5b7-4165-a3b9-28906aeb9ecf-kube-api-access-7bjnm\") pod \"keystone-bootstrap-ksrd6\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.046176 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kqn9c"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100438 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-config\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100497 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-scripts\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100519 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd001ffc-9a83-408f-bc46-9a7cacf052c7-etc-machine-id\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100550 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-config-data\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100661 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-combined-ca-bundle\") pod \"cinder-db-sync-kqn9c\" 
(UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100730 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-scripts\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100780 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-swift-storage-0\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100803 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a798438d-196b-45a1-9e5a-20aeff4ffd8b-horizon-secret-key\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100833 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a798438d-196b-45a1-9e5a-20aeff4ffd8b-logs\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100858 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-nb\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100888 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-svc\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.100917 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-db-sync-config-data\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.101003 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nglx\" (UniqueName: \"kubernetes.io/projected/fd001ffc-9a83-408f-bc46-9a7cacf052c7-kube-api-access-4nglx\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.101056 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2jpn\" (UniqueName: \"kubernetes.io/projected/a798438d-196b-45a1-9e5a-20aeff4ffd8b-kube-api-access-r2jpn\") pod \"horizon-5794c66b9f-22ws5\" (UID: 
\"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.101194 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-sb\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.101232 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-config-data\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.101273 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4b97\" (UniqueName: \"kubernetes.io/projected/d08b2e9e-5789-4580-82fc-47037f4995b1-kube-api-access-n4b97\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.101641 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-swift-storage-0\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.101894 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-nb\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.102197 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-sb\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.102326 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-svc\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.102870 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-config\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.134015 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wqltl"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.137414 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.144999 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4b97\" (UniqueName: \"kubernetes.io/projected/d08b2e9e-5789-4580-82fc-47037f4995b1-kube-api-access-n4b97\") pod \"dnsmasq-dns-8c7bdb785-9vmgv\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.151089 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f2cgs" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.157725 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.165593 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-644db6f6f-q824q"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.166929 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.181334 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-68nxm"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.182712 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.187704 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.188717 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.188902 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.189143 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hk688" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.202722 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203140 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wqltl"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203770 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-config-data\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203819 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-combined-ca-bundle\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203840 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-scripts\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203860 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd001ffc-9a83-408f-bc46-9a7cacf052c7-etc-machine-id\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203883 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-db-sync-config-data\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203909 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-config-data\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203935 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-combined-ca-bundle\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203950 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-scripts\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203964 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a798438d-196b-45a1-9e5a-20aeff4ffd8b-horizon-secret-key\") pod \"horizon-5794c66b9f-22ws5\" (UID: 
\"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.203976 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a798438d-196b-45a1-9e5a-20aeff4ffd8b-logs\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.204005 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-db-sync-config-data\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.204032 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nglx\" (UniqueName: \"kubernetes.io/projected/fd001ffc-9a83-408f-bc46-9a7cacf052c7-kube-api-access-4nglx\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.204054 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2jpn\" (UniqueName: \"kubernetes.io/projected/a798438d-196b-45a1-9e5a-20aeff4ffd8b-kube-api-access-r2jpn\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.204075 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp26g\" (UniqueName: \"kubernetes.io/projected/7e344532-aeaf-4acf-9d1c-ebc0290e406e-kube-api-access-sp26g\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.205145 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-config-data\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.209233 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a798438d-196b-45a1-9e5a-20aeff4ffd8b-logs\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.209579 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd001ffc-9a83-408f-bc46-9a7cacf052c7-etc-machine-id\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.210061 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-scripts\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.223627 4693 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/horizon-644db6f6f-q824q"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.235276 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-combined-ca-bundle\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.239247 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-scripts\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.239279 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-68nxm"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.245778 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2jpn\" (UniqueName: \"kubernetes.io/projected/a798438d-196b-45a1-9e5a-20aeff4ffd8b-kube-api-access-r2jpn\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.247079 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a798438d-196b-45a1-9e5a-20aeff4ffd8b-horizon-secret-key\") pod \"horizon-5794c66b9f-22ws5\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.247157 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-config-data\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.247629 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nglx\" (UniqueName: \"kubernetes.io/projected/fd001ffc-9a83-408f-bc46-9a7cacf052c7-kube-api-access-4nglx\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.262249 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-db-sync-config-data\") pod \"cinder-db-sync-kqn9c\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.290624 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-trpsm"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.291670 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.295330 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.295587 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.296135 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p89nn" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.300813 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308432 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-combined-ca-bundle\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308472 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-config\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308497 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-db-sync-config-data\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308524 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52tr9\" (UniqueName: \"kubernetes.io/projected/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-kube-api-access-52tr9\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308572 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-scripts\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308594 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-logs\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308617 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-config-data\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308637 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-horizon-secret-key\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308658 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp26g\" (UniqueName: \"kubernetes.io/projected/7e344532-aeaf-4acf-9d1c-ebc0290e406e-kube-api-access-sp26g\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308715 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvjj9\" (UniqueName: \"kubernetes.io/projected/6340c0be-f12e-4dac-908a-480c7ed0e1e8-kube-api-access-mvjj9\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.308736 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-combined-ca-bundle\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.318720 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-db-sync-config-data\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.323100 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-combined-ca-bundle\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.335487 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-trpsm"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.346847 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.350391 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.354892 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp26g\" (UniqueName: \"kubernetes.io/projected/7e344532-aeaf-4acf-9d1c-ebc0290e406e-kube-api-access-sp26g\") pod \"barbican-db-sync-wqltl\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.357576 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.360472 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.360777 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.363480 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8c7bdb785-9vmgv"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.371710 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.425818 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvjj9\" (UniqueName: \"kubernetes.io/projected/6340c0be-f12e-4dac-908a-480c7ed0e1e8-kube-api-access-mvjj9\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.425876 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e953d6de-5a10-4627-8a1b-654ce6219d52-logs\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.425902 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-combined-ca-bundle\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.425948 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-config\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.425994 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52tr9\" (UniqueName: \"kubernetes.io/projected/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-kube-api-access-52tr9\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426039 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcpd\" (UniqueName: \"kubernetes.io/projected/e953d6de-5a10-4627-8a1b-654ce6219d52-kube-api-access-6qcpd\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426104 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-scripts\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426139 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-logs\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426169 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-scripts\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426197 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-combined-ca-bundle\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426224 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-config-data\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426255 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-horizon-secret-key\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.426306 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-config-data\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.427628 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-logs\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.430776 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wqltl" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.432908 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-scripts\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.432968 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-config-data\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.439539 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-horizon-secret-key\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.441969 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-config\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.454070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvjj9\" (UniqueName: \"kubernetes.io/projected/6340c0be-f12e-4dac-908a-480c7ed0e1e8-kube-api-access-mvjj9\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.468831 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-combined-ca-bundle\") pod \"neutron-db-sync-68nxm\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:43 crc kubenswrapper[4693]: I1125 12:26:43.479923 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52tr9\" (UniqueName: \"kubernetes.io/projected/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-kube-api-access-52tr9\") pod \"horizon-644db6f6f-q824q\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.491594 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76c8d5b9fc-mp5vw"] Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.493677 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.496069 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c8d5b9fc-mp5vw"] Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.503833 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-68nxm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530566 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-scripts\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530632 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-combined-ca-bundle\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530689 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530728 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-config-data\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530752 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-config-data\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530808 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mbsb\" (UniqueName: \"kubernetes.io/projected/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-kube-api-access-7mbsb\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530832 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e953d6de-5a10-4627-8a1b-654ce6219d52-logs\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530856 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-run-httpd\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530936 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcpd\" (UniqueName: \"kubernetes.io/projected/e953d6de-5a10-4627-8a1b-654ce6219d52-kube-api-access-6qcpd\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530968 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-scripts\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.530992 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-log-httpd\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.531019 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.531767 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e953d6de-5a10-4627-8a1b-654ce6219d52-logs\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.547726 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-scripts\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.555790 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-combined-ca-bundle\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.558155 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-config-data\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.565141 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcpd\" (UniqueName: \"kubernetes.io/projected/e953d6de-5a10-4627-8a1b-654ce6219d52-kube-api-access-6qcpd\") pod \"placement-db-sync-trpsm\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.577364 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-trpsm" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632362 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-config-data\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632678 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-config\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632706 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-swift-storage-0\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632753 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-nb\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632776 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mbsb\" (UniqueName: \"kubernetes.io/projected/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-kube-api-access-7mbsb\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632791 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-svc\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632829 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-run-httpd\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632869 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2mpj\" (UniqueName: \"kubernetes.io/projected/4d599a55-d080-44b9-b2e3-1f94e1724ad6-kube-api-access-w2mpj\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632900 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-scripts\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 
12:26:43.632918 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-log-httpd\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632936 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632977 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.632998 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-sb\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.639047 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-config-data\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.640765 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.641031 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-log-httpd\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.641234 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-run-httpd\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.643168 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-scripts\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.645020 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.660320 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mbsb\" (UniqueName: \"kubernetes.io/projected/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-kube-api-access-7mbsb\") pod \"ceilometer-0\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.734332 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-sb\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.734634 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-config\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.734668 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-swift-storage-0\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.734729 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-nb\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.734749 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-svc\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.734793 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2mpj\" (UniqueName: \"kubernetes.io/projected/4d599a55-d080-44b9-b2e3-1f94e1724ad6-kube-api-access-w2mpj\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.735676 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-config\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.735966 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-svc\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.735966 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-sb\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.736191 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-swift-storage-0\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.736972 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-nb\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.746999 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.751023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2mpj\" (UniqueName: \"kubernetes.io/projected/4d599a55-d080-44b9-b2e3-1f94e1724ad6-kube-api-access-w2mpj\") pod \"dnsmasq-dns-76c8d5b9fc-mp5vw\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.895690 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:43.908826 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.402938 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8c7bdb785-9vmgv"] Nov 25 12:26:44 crc kubenswrapper[4693]: W1125 12:26:44.453463 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd08b2e9e_5789_4580_82fc_47037f4995b1.slice/crio-1e092b5677cfc76b1ef4c650186df7d704cd779e083a69bfbefac4aa4be0b2d6 WatchSource:0}: Error finding container 1e092b5677cfc76b1ef4c650186df7d704cd779e083a69bfbefac4aa4be0b2d6: Status 404 returned error can't find the container with id 1e092b5677cfc76b1ef4c650186df7d704cd779e083a69bfbefac4aa4be0b2d6 Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.568232 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" event={"ID":"d08b2e9e-5789-4580-82fc-47037f4995b1","Type":"ContainerStarted","Data":"1e092b5677cfc76b1ef4c650186df7d704cd779e083a69bfbefac4aa4be0b2d6"} Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.575916 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.759271 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-nb\") pod \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.759327 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-sb\") pod \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.759898 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-svc\") pod \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.759935 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntqdc\" (UniqueName: \"kubernetes.io/projected/f1a4c06a-2f49-4d49-9bd1-387e002c7922-kube-api-access-ntqdc\") pod \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.760015 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-config\") pod \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.760087 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-swift-storage-0\") pod \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\" (UID: \"f1a4c06a-2f49-4d49-9bd1-387e002c7922\") " Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.769015 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a4c06a-2f49-4d49-9bd1-387e002c7922-kube-api-access-ntqdc" (OuterVolumeSpecName: "kube-api-access-ntqdc") pod "f1a4c06a-2f49-4d49-9bd1-387e002c7922" (UID: "f1a4c06a-2f49-4d49-9bd1-387e002c7922"). InnerVolumeSpecName "kube-api-access-ntqdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.807224 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1a4c06a-2f49-4d49-9bd1-387e002c7922" (UID: "f1a4c06a-2f49-4d49-9bd1-387e002c7922"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.808807 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1a4c06a-2f49-4d49-9bd1-387e002c7922" (UID: "f1a4c06a-2f49-4d49-9bd1-387e002c7922"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.810190 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1a4c06a-2f49-4d49-9bd1-387e002c7922" (UID: "f1a4c06a-2f49-4d49-9bd1-387e002c7922"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.825422 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-config" (OuterVolumeSpecName: "config") pod "f1a4c06a-2f49-4d49-9bd1-387e002c7922" (UID: "f1a4c06a-2f49-4d49-9bd1-387e002c7922"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.831655 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1a4c06a-2f49-4d49-9bd1-387e002c7922" (UID: "f1a4c06a-2f49-4d49-9bd1-387e002c7922"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.861687 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.861726 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.861738 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.861747 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntqdc\" (UniqueName: \"kubernetes.io/projected/f1a4c06a-2f49-4d49-9bd1-387e002c7922-kube-api-access-ntqdc\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.861757 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.861765 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1a4c06a-2f49-4d49-9bd1-387e002c7922-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.921439 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-kqn9c"] Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.927884 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5794c66b9f-22ws5"] Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.940876 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wqltl"] Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.953187 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-68nxm"] Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.963674 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-644db6f6f-q824q"] Nov 25 12:26:44 crc kubenswrapper[4693]: W1125 12:26:44.963985 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda798438d_196b_45a1_9e5a_20aeff4ffd8b.slice/crio-c5ebf4f0cb0d0255dcfe7c0b91977f61957d7e95b0203ed002642e327dcce38b WatchSource:0}: Error finding container c5ebf4f0cb0d0255dcfe7c0b91977f61957d7e95b0203ed002642e327dcce38b: Status 404 returned error can't find the container with id c5ebf4f0cb0d0255dcfe7c0b91977f61957d7e95b0203ed002642e327dcce38b Nov 25 12:26:44 crc kubenswrapper[4693]: W1125 12:26:44.965544 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e344532_aeaf_4acf_9d1c_ebc0290e406e.slice/crio-7f8e1fee741ea05a7ac2d90f027a194b7215b4c0f0483e26d1dfbcf608c25f42 WatchSource:0}: Error finding container 7f8e1fee741ea05a7ac2d90f027a194b7215b4c0f0483e26d1dfbcf608c25f42: Status 404 returned error can't find the container with id 7f8e1fee741ea05a7ac2d90f027a194b7215b4c0f0483e26d1dfbcf608c25f42 Nov 25 12:26:44 crc kubenswrapper[4693]: W1125 12:26:44.965874 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6340c0be_f12e_4dac_908a_480c7ed0e1e8.slice/crio-a3f51d2e9004a9fcbb50f357e8f16a140fabb016b12e5087568895e434908bbf WatchSource:0}: Error finding container a3f51d2e9004a9fcbb50f357e8f16a140fabb016b12e5087568895e434908bbf: Status 404 returned error can't find the container with id a3f51d2e9004a9fcbb50f357e8f16a140fabb016b12e5087568895e434908bbf Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.970458 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ksrd6"] Nov 25 12:26:44 crc kubenswrapper[4693]: W1125 12:26:44.976206 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod432eeae5_f9f6_49cc_ab8d_d024cfed66cb.slice/crio-35db3d2ebbf57222f6b6facc672200069ba51dd47406e92d1a190d9d0f077b79 WatchSource:0}: Error finding container 35db3d2ebbf57222f6b6facc672200069ba51dd47406e92d1a190d9d0f077b79: Status 404 returned error can't find the container with id 35db3d2ebbf57222f6b6facc672200069ba51dd47406e92d1a190d9d0f077b79 Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.986662 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-trpsm"] Nov 25 12:26:44 crc kubenswrapper[4693]: W1125 12:26:44.992055 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod353ca052_e5b7_4165_a3b9_28906aeb9ecf.slice/crio-2cf462fb05ca3e2fdbda6bd0f5ff4d92db923f28807ca6e26e588ba6f044a563 WatchSource:0}: Error finding container 2cf462fb05ca3e2fdbda6bd0f5ff4d92db923f28807ca6e26e588ba6f044a563: Status 404 returned error can't find the container with id 2cf462fb05ca3e2fdbda6bd0f5ff4d92db923f28807ca6e26e588ba6f044a563 Nov 25 12:26:44 crc kubenswrapper[4693]: I1125 12:26:44.995544 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c8d5b9fc-mp5vw"] Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.002242 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:26:45 
crc kubenswrapper[4693]: I1125 12:26:45.578150 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqltl" event={"ID":"7e344532-aeaf-4acf-9d1c-ebc0290e406e","Type":"ContainerStarted","Data":"7f8e1fee741ea05a7ac2d90f027a194b7215b4c0f0483e26d1dfbcf608c25f42"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.579620 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kqn9c" event={"ID":"fd001ffc-9a83-408f-bc46-9a7cacf052c7","Type":"ContainerStarted","Data":"54ca02c8bc011bc57eaeecf8ede7c61737d9421bfa9ad17258191c951084fc91"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.580480 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644db6f6f-q824q" event={"ID":"432eeae5-f9f6-49cc-ab8d-d024cfed66cb","Type":"ContainerStarted","Data":"35db3d2ebbf57222f6b6facc672200069ba51dd47406e92d1a190d9d0f077b79"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.581548 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-trpsm" event={"ID":"e953d6de-5a10-4627-8a1b-654ce6219d52","Type":"ContainerStarted","Data":"2784d862924b2b818aa285fcc287e7187dcb511add39a5ca310784a07aefff71"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.583607 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ksrd6" event={"ID":"353ca052-e5b7-4165-a3b9-28906aeb9ecf","Type":"ContainerStarted","Data":"961aaf75dc8888f81ceeb9e48896e6b94d151b30cf1e38ef118b0cbec141e375"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.583635 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ksrd6" event={"ID":"353ca052-e5b7-4165-a3b9-28906aeb9ecf","Type":"ContainerStarted","Data":"2cf462fb05ca3e2fdbda6bd0f5ff4d92db923f28807ca6e26e588ba6f044a563"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.587876 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qdmcl" event={"ID":"e8135cf2-4e92-4e70-9c47-f5fae388c0be","Type":"ContainerStarted","Data":"8f7f05217b5ef487dfd9d9d2732365c15c6ba795b96a06a0e8ae5807fc0afe5e"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.589823 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5794c66b9f-22ws5" event={"ID":"a798438d-196b-45a1-9e5a-20aeff4ffd8b","Type":"ContainerStarted","Data":"c5ebf4f0cb0d0255dcfe7c0b91977f61957d7e95b0203ed002642e327dcce38b"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.591120 4693 generic.go:334] "Generic (PLEG): container finished" podID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerID="2eb4cf38a68361738c7dbe495c83e8a0047e30c72dc62f70f6869ed112b42ddd" exitCode=0 Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.591170 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" event={"ID":"4d599a55-d080-44b9-b2e3-1f94e1724ad6","Type":"ContainerDied","Data":"2eb4cf38a68361738c7dbe495c83e8a0047e30c72dc62f70f6869ed112b42ddd"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.591184 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" event={"ID":"4d599a55-d080-44b9-b2e3-1f94e1724ad6","Type":"ContainerStarted","Data":"454a9d3ca8dd0c1496dd35195579f55681943c8a9aaaf82d9a5d15ab92acfe7b"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.595970 4693 generic.go:334] "Generic (PLEG): container finished" podID="d08b2e9e-5789-4580-82fc-47037f4995b1" 
containerID="cace8bf1dacf9b476cb824c3a7603286ae2f067896ef9111f8293b541bc624b2" exitCode=0 Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.596099 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" event={"ID":"d08b2e9e-5789-4580-82fc-47037f4995b1","Type":"ContainerDied","Data":"cace8bf1dacf9b476cb824c3a7603286ae2f067896ef9111f8293b541bc624b2"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.603150 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerStarted","Data":"ba1d20e04b6de6e5c625328ba0d87d54c8292f30776bc738678e90558665f7b4"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.613181 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ksrd6" podStartSLOduration=3.613157039 podStartE2EDuration="3.613157039s" podCreationTimestamp="2025-11-25 12:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:45.603728392 +0000 UTC m=+1125.521813773" watchObservedRunningTime="2025-11-25 12:26:45.613157039 +0000 UTC m=+1125.531242420" Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.619649 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68nxm" event={"ID":"6340c0be-f12e-4dac-908a-480c7ed0e1e8","Type":"ContainerStarted","Data":"f27fcc7bf6f370050ccc53520f9ced274f34d5f7fa6e8acf1c73bc6da3c1a826"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.619710 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68nxm" event={"ID":"6340c0be-f12e-4dac-908a-480c7ed0e1e8","Type":"ContainerStarted","Data":"a3f51d2e9004a9fcbb50f357e8f16a140fabb016b12e5087568895e434908bbf"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.632348 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" event={"ID":"f1a4c06a-2f49-4d49-9bd1-387e002c7922","Type":"ContainerDied","Data":"fce87573985b178b79227f32b13f09f11e182f2b1d81c774109023eb4c20f510"} Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.632429 4693 scope.go:117] "RemoveContainer" containerID="63f5ffa878b7c972ddc0f1dc01cd81f6de968d64fc3de8a8cc4e2d8504ed576c" Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.632647 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86c887b9fc-r66cj" Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.749115 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qdmcl" podStartSLOduration=3.152197124 podStartE2EDuration="33.749076986s" podCreationTimestamp="2025-11-25 12:26:12 +0000 UTC" firstStartedPulling="2025-11-25 12:26:13.68415822 +0000 UTC m=+1093.602243601" lastFinishedPulling="2025-11-25 12:26:44.281038082 +0000 UTC m=+1124.199123463" observedRunningTime="2025-11-25 12:26:45.723545131 +0000 UTC m=+1125.641630522" watchObservedRunningTime="2025-11-25 12:26:45.749076986 +0000 UTC m=+1125.667162357" Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.770806 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-68nxm" podStartSLOduration=2.770787081 podStartE2EDuration="2.770787081s" podCreationTimestamp="2025-11-25 12:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:45.759040458 +0000 UTC m=+1125.677125849" watchObservedRunningTime="2025-11-25 12:26:45.770787081 +0000 UTC m=+1125.688872452" Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.864771 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86c887b9fc-r66cj"] Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.900399 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86c887b9fc-r66cj"] Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.979840 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5794c66b9f-22ws5"] Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.998939 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6dc8b554d5-7gtks"] Nov 25 12:26:45 crc kubenswrapper[4693]: E1125 12:26:45.999293 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a4c06a-2f49-4d49-9bd1-387e002c7922" containerName="init" Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.999312 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4c06a-2f49-4d49-9bd1-387e002c7922" containerName="init" Nov 25 12:26:45 crc kubenswrapper[4693]: I1125 12:26:45.999511 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a4c06a-2f49-4d49-9bd1-387e002c7922" containerName="init" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.000433 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.035157 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dc8b554d5-7gtks"] Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.076325 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.103667 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-scripts\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.103762 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-config-data\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.103794 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a054005c-e9b4-40f1-b795-4afc6df458f5-horizon-secret-key\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.103825 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gsnd\" (UniqueName: \"kubernetes.io/projected/a054005c-e9b4-40f1-b795-4afc6df458f5-kube-api-access-9gsnd\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.103864 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a054005c-e9b4-40f1-b795-4afc6df458f5-logs\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.200309 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.206355 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a054005c-e9b4-40f1-b795-4afc6df458f5-horizon-secret-key\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.206443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gsnd\" (UniqueName: \"kubernetes.io/projected/a054005c-e9b4-40f1-b795-4afc6df458f5-kube-api-access-9gsnd\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.206479 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a054005c-e9b4-40f1-b795-4afc6df458f5-logs\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.206573 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-scripts\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.206616 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-config-data\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.207248 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a054005c-e9b4-40f1-b795-4afc6df458f5-logs\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.207480 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-scripts\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.207786 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-config-data\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.231990 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gsnd\" (UniqueName: \"kubernetes.io/projected/a054005c-e9b4-40f1-b795-4afc6df458f5-kube-api-access-9gsnd\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.239804 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a054005c-e9b4-40f1-b795-4afc6df458f5-horizon-secret-key\") pod \"horizon-6dc8b554d5-7gtks\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.308193 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-config\") pod \"d08b2e9e-5789-4580-82fc-47037f4995b1\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.308433 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-sb\") pod \"d08b2e9e-5789-4580-82fc-47037f4995b1\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.308801 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-svc\") pod \"d08b2e9e-5789-4580-82fc-47037f4995b1\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.308843 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-swift-storage-0\") pod \"d08b2e9e-5789-4580-82fc-47037f4995b1\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.308886 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-nb\") pod \"d08b2e9e-5789-4580-82fc-47037f4995b1\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.309006 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4b97\" (UniqueName: \"kubernetes.io/projected/d08b2e9e-5789-4580-82fc-47037f4995b1-kube-api-access-n4b97\") pod \"d08b2e9e-5789-4580-82fc-47037f4995b1\" (UID: \"d08b2e9e-5789-4580-82fc-47037f4995b1\") " Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.332584 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08b2e9e-5789-4580-82fc-47037f4995b1-kube-api-access-n4b97" (OuterVolumeSpecName: "kube-api-access-n4b97") pod "d08b2e9e-5789-4580-82fc-47037f4995b1" (UID: "d08b2e9e-5789-4580-82fc-47037f4995b1"). InnerVolumeSpecName "kube-api-access-n4b97". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.347579 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.361482 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-config" (OuterVolumeSpecName: "config") pod "d08b2e9e-5789-4580-82fc-47037f4995b1" (UID: "d08b2e9e-5789-4580-82fc-47037f4995b1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.361960 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d08b2e9e-5789-4580-82fc-47037f4995b1" (UID: "d08b2e9e-5789-4580-82fc-47037f4995b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.375684 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d08b2e9e-5789-4580-82fc-47037f4995b1" (UID: "d08b2e9e-5789-4580-82fc-47037f4995b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.385424 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d08b2e9e-5789-4580-82fc-47037f4995b1" (UID: "d08b2e9e-5789-4580-82fc-47037f4995b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.393441 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d08b2e9e-5789-4580-82fc-47037f4995b1" (UID: "d08b2e9e-5789-4580-82fc-47037f4995b1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.417451 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4b97\" (UniqueName: \"kubernetes.io/projected/d08b2e9e-5789-4580-82fc-47037f4995b1-kube-api-access-n4b97\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.417484 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.417497 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.417508 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.417519 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.417528 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d08b2e9e-5789-4580-82fc-47037f4995b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.706666 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" event={"ID":"4d599a55-d080-44b9-b2e3-1f94e1724ad6","Type":"ContainerStarted","Data":"bc7e5d881cdd7f561933c7688600977769a0fecd40b302575bb4ea5fd25ae2d9"} Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.707521 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.733525 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" event={"ID":"d08b2e9e-5789-4580-82fc-47037f4995b1","Type":"ContainerDied","Data":"1e092b5677cfc76b1ef4c650186df7d704cd779e083a69bfbefac4aa4be0b2d6"} Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.733619 4693 scope.go:117] "RemoveContainer" containerID="cace8bf1dacf9b476cb824c3a7603286ae2f067896ef9111f8293b541bc624b2" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.733923 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8c7bdb785-9vmgv" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.848506 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" podStartSLOduration=3.848476179 podStartE2EDuration="3.848476179s" podCreationTimestamp="2025-11-25 12:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:26:46.752757854 +0000 UTC m=+1126.670843235" watchObservedRunningTime="2025-11-25 12:26:46.848476179 +0000 UTC m=+1126.766561660" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.858304 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a4c06a-2f49-4d49-9bd1-387e002c7922" path="/var/lib/kubelet/pods/f1a4c06a-2f49-4d49-9bd1-387e002c7922/volumes" Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.883573 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8c7bdb785-9vmgv"] Nov 25 12:26:46 crc kubenswrapper[4693]: I1125 12:26:46.912757 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8c7bdb785-9vmgv"] Nov 25 12:26:47 crc kubenswrapper[4693]: I1125 12:26:47.129613 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dc8b554d5-7gtks"] Nov 25 12:26:47 crc kubenswrapper[4693]: I1125 12:26:47.758810 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dc8b554d5-7gtks" event={"ID":"a054005c-e9b4-40f1-b795-4afc6df458f5","Type":"ContainerStarted","Data":"e5b9cc018ae3b77de4238fc51d13d7575ae98812b64848521c4e8602bd86694c"} Nov 25 12:26:48 crc kubenswrapper[4693]: I1125 12:26:48.841020 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08b2e9e-5789-4580-82fc-47037f4995b1" path="/var/lib/kubelet/pods/d08b2e9e-5789-4580-82fc-47037f4995b1/volumes" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.191260 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-644db6f6f-q824q"] Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.220988 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bbcbd4584-78jln"] Nov 25 12:26:52 crc kubenswrapper[4693]: E1125 12:26:52.221645 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08b2e9e-5789-4580-82fc-47037f4995b1" containerName="init" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.221733 4693 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d08b2e9e-5789-4580-82fc-47037f4995b1" containerName="init" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.229534 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08b2e9e-5789-4580-82fc-47037f4995b1" containerName="init" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.231437 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.238306 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.254586 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbcbd4584-78jln"] Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.359690 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dc8b554d5-7gtks"] Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.375159 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f60abf7-3c23-4174-9150-50061c054cf5-logs\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.375229 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-secret-key\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.375247 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-config-data\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.375277 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-tls-certs\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.375322 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-combined-ca-bundle\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.375351 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-885nr\" (UniqueName: \"kubernetes.io/projected/1f60abf7-3c23-4174-9150-50061c054cf5-kube-api-access-885nr\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.375414 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-scripts\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.408666 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-574fd6fdfd-bz6sm"] Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.413935 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.440206 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-574fd6fdfd-bz6sm"] Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.479221 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-combined-ca-bundle\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.479367 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-885nr\" (UniqueName: \"kubernetes.io/projected/1f60abf7-3c23-4174-9150-50061c054cf5-kube-api-access-885nr\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.479805 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-scripts\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.479928 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f60abf7-3c23-4174-9150-50061c054cf5-logs\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.480093 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-secret-key\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.480156 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-config-data\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.480280 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-tls-certs\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.482315 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1f60abf7-3c23-4174-9150-50061c054cf5-logs\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.483060 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-scripts\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.485521 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-config-data\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.488207 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-secret-key\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.488803 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-tls-certs\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.489685 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-combined-ca-bundle\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.519722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-885nr\" (UniqueName: \"kubernetes.io/projected/1f60abf7-3c23-4174-9150-50061c054cf5-kube-api-access-885nr\") pod \"horizon-7bbcbd4584-78jln\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.563895 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.585886 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvddn\" (UniqueName: \"kubernetes.io/projected/14ff5a36-1912-43a8-b87f-57a6858a5799-kube-api-access-cvddn\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.585945 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-horizon-secret-key\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.585973 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14ff5a36-1912-43a8-b87f-57a6858a5799-config-data\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.586033 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14ff5a36-1912-43a8-b87f-57a6858a5799-scripts\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.586065 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ff5a36-1912-43a8-b87f-57a6858a5799-logs\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.586110 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-horizon-tls-certs\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.586147 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-combined-ca-bundle\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.688102 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-horizon-tls-certs\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.688172 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-combined-ca-bundle\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: 
\"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.688239 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvddn\" (UniqueName: \"kubernetes.io/projected/14ff5a36-1912-43a8-b87f-57a6858a5799-kube-api-access-cvddn\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.688259 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-horizon-secret-key\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.688281 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14ff5a36-1912-43a8-b87f-57a6858a5799-config-data\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.688331 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14ff5a36-1912-43a8-b87f-57a6858a5799-scripts\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.688358 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ff5a36-1912-43a8-b87f-57a6858a5799-logs\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.884773 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14ff5a36-1912-43a8-b87f-57a6858a5799-logs\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:52 crc kubenswrapper[4693]: I1125 12:26:52.887481 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14ff5a36-1912-43a8-b87f-57a6858a5799-scripts\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.427805 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/14ff5a36-1912-43a8-b87f-57a6858a5799-config-data\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.429416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvddn\" (UniqueName: \"kubernetes.io/projected/14ff5a36-1912-43a8-b87f-57a6858a5799-kube-api-access-cvddn\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.431056 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-horizon-secret-key\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.431076 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-horizon-tls-certs\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.431549 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14ff5a36-1912-43a8-b87f-57a6858a5799-combined-ca-bundle\") pod \"horizon-574fd6fdfd-bz6sm\" (UID: \"14ff5a36-1912-43a8-b87f-57a6858a5799\") " pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.646907 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.910563 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.966462 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf8bcbfcf-pwdqc"] Nov 25 12:26:53 crc kubenswrapper[4693]: I1125 12:26:53.966733 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="dnsmasq-dns" containerID="cri-o://00a0e5266a15e605189392cd68b00a1bdb5aa566f7412a99d381fb13ef268035" gracePeriod=10 Nov 25 12:26:54 crc kubenswrapper[4693]: I1125 12:26:54.386465 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Nov 25 12:26:54 crc kubenswrapper[4693]: I1125 12:26:54.822998 4693 generic.go:334] "Generic (PLEG): container finished" podID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerID="00a0e5266a15e605189392cd68b00a1bdb5aa566f7412a99d381fb13ef268035" exitCode=0 Nov 25 12:26:54 crc kubenswrapper[4693]: I1125 12:26:54.835523 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" event={"ID":"4decc5db-72bf-4967-b2dd-6f843f9fb3ce","Type":"ContainerDied","Data":"00a0e5266a15e605189392cd68b00a1bdb5aa566f7412a99d381fb13ef268035"} Nov 25 12:26:55 crc kubenswrapper[4693]: I1125 12:26:55.838749 4693 generic.go:334] "Generic (PLEG): container finished" podID="353ca052-e5b7-4165-a3b9-28906aeb9ecf" containerID="961aaf75dc8888f81ceeb9e48896e6b94d151b30cf1e38ef118b0cbec141e375" exitCode=0 Nov 25 12:26:55 crc kubenswrapper[4693]: I1125 12:26:55.838799 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ksrd6" event={"ID":"353ca052-e5b7-4165-a3b9-28906aeb9ecf","Type":"ContainerDied","Data":"961aaf75dc8888f81ceeb9e48896e6b94d151b30cf1e38ef118b0cbec141e375"} Nov 25 12:26:59 crc kubenswrapper[4693]: I1125 12:26:59.386671 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" 
podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Nov 25 12:27:04 crc kubenswrapper[4693]: I1125 12:27:04.386810 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Nov 25 12:27:04 crc kubenswrapper[4693]: I1125 12:27:04.387882 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.399283 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.399487 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fbh679h695hfch548hb7h5fh5d4h5dh7ch78h574h578h564h684h55dh5fdh5d8h8chfh5d4hb5h54dh57ch6bh587h5bch56fh5f8h576h659hdbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gsnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6dc8b554d5-7gtks_openstack(a054005c-e9b4-40f1-b795-4afc6df458f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.402636 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for 
\"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057\\\"\"]" pod="openstack/horizon-6dc8b554d5-7gtks" podUID="a054005c-e9b4-40f1-b795-4afc6df458f5" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.415696 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.415864 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n698h5bhfh5fdh8h88h55ch646h5c6h55dh5dch98hf5hfbh86h5fch577h595h679h5cch97hddh5d4hc9h6chfch66dh5b4h69h68dhb4hc8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2jpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5794c66b9f-22ws5_openstack(a798438d-196b-45a1-9e5a-20aeff4ffd8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.421289 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057\\\"\"]" pod="openstack/horizon-5794c66b9f-22ws5" podUID="a798438d-196b-45a1-9e5a-20aeff4ffd8b" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.433820 4693 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.433999 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5f8h5bdh6dhfdh84h544h9dh88h89h5fdhbchd8hfbh65bhc9h59ch55dh6fh64fh654h95h68dh564h5dfh676h649hf7h5c5h8bh5dbh565h68cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-52tr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-644db6f6f-q824q_openstack(432eeae5-f9f6-49cc-ab8d-d024cfed66cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:27:04 crc kubenswrapper[4693]: E1125 12:27:04.436122 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:65b94ff9fcd486845fb0544583bf2a973246a61a0ad32340fb92d632285f1057\\\"\"]" pod="openstack/horizon-644db6f6f-q824q" podUID="432eeae5-f9f6-49cc-ab8d-d024cfed66cb" Nov 25 12:27:06 crc kubenswrapper[4693]: I1125 12:27:06.918654 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:27:06 crc kubenswrapper[4693]: I1125 12:27:06.926641 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ksrd6" event={"ID":"353ca052-e5b7-4165-a3b9-28906aeb9ecf","Type":"ContainerDied","Data":"2cf462fb05ca3e2fdbda6bd0f5ff4d92db923f28807ca6e26e588ba6f044a563"} Nov 25 12:27:06 crc kubenswrapper[4693]: I1125 12:27:06.926678 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cf462fb05ca3e2fdbda6bd0f5ff4d92db923f28807ca6e26e588ba6f044a563" Nov 25 12:27:06 crc kubenswrapper[4693]: I1125 12:27:06.926714 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ksrd6" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.095500 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjnm\" (UniqueName: \"kubernetes.io/projected/353ca052-e5b7-4165-a3b9-28906aeb9ecf-kube-api-access-7bjnm\") pod \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.095665 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-credential-keys\") pod \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.095778 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-fernet-keys\") pod \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.095818 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-config-data\") pod \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.095886 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-scripts\") pod \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.095916 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-combined-ca-bundle\") pod \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\" (UID: \"353ca052-e5b7-4165-a3b9-28906aeb9ecf\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.102088 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "353ca052-e5b7-4165-a3b9-28906aeb9ecf" (UID: "353ca052-e5b7-4165-a3b9-28906aeb9ecf"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.106347 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-scripts" (OuterVolumeSpecName: "scripts") pod "353ca052-e5b7-4165-a3b9-28906aeb9ecf" (UID: "353ca052-e5b7-4165-a3b9-28906aeb9ecf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.106437 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "353ca052-e5b7-4165-a3b9-28906aeb9ecf" (UID: "353ca052-e5b7-4165-a3b9-28906aeb9ecf"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.106810 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353ca052-e5b7-4165-a3b9-28906aeb9ecf-kube-api-access-7bjnm" (OuterVolumeSpecName: "kube-api-access-7bjnm") pod "353ca052-e5b7-4165-a3b9-28906aeb9ecf" (UID: "353ca052-e5b7-4165-a3b9-28906aeb9ecf"). InnerVolumeSpecName "kube-api-access-7bjnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.136149 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-config-data" (OuterVolumeSpecName: "config-data") pod "353ca052-e5b7-4165-a3b9-28906aeb9ecf" (UID: "353ca052-e5b7-4165-a3b9-28906aeb9ecf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.145911 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "353ca052-e5b7-4165-a3b9-28906aeb9ecf" (UID: "353ca052-e5b7-4165-a3b9-28906aeb9ecf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.198921 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.199173 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.199232 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjnm\" (UniqueName: \"kubernetes.io/projected/353ca052-e5b7-4165-a3b9-28906aeb9ecf-kube-api-access-7bjnm\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.199282 4693 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.199328 4693 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.199392 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/353ca052-e5b7-4165-a3b9-28906aeb9ecf-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.475924 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:27:07 crc kubenswrapper[4693]: E1125 12:27:07.543184 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140" Nov 25 12:27:07 crc kubenswrapper[4693]: E1125 12:27:07.543363 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:d375d370be5ead0dac71109af644849e5795f535f9ad8eeacea261d77ae6f140,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n676h5bch4hbfh555h674h55dh574h5f4h5c8h558h67h6ch654h68dh5c6h546h5d5h6dh5cdh5c7h54ch9ch648hddh677h57fh5c5h69h5fbh9bh95q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7mbsb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(a0a41873-5fe3-4e4d-9a0c-556e6c85919d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.604489 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-dns-svc\") pod \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.604569 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-sb\") pod \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.604626 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pnhk\" (UniqueName: \"kubernetes.io/projected/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-kube-api-access-5pnhk\") pod \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.604710 4693 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-nb\") pod \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.604731 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-config\") pod \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\" (UID: \"4decc5db-72bf-4967-b2dd-6f843f9fb3ce\") " Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.608058 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-kube-api-access-5pnhk" (OuterVolumeSpecName: "kube-api-access-5pnhk") pod "4decc5db-72bf-4967-b2dd-6f843f9fb3ce" (UID: "4decc5db-72bf-4967-b2dd-6f843f9fb3ce"). InnerVolumeSpecName "kube-api-access-5pnhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.647406 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-config" (OuterVolumeSpecName: "config") pod "4decc5db-72bf-4967-b2dd-6f843f9fb3ce" (UID: "4decc5db-72bf-4967-b2dd-6f843f9fb3ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.649582 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4decc5db-72bf-4967-b2dd-6f843f9fb3ce" (UID: "4decc5db-72bf-4967-b2dd-6f843f9fb3ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.649664 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4decc5db-72bf-4967-b2dd-6f843f9fb3ce" (UID: "4decc5db-72bf-4967-b2dd-6f843f9fb3ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.662893 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4decc5db-72bf-4967-b2dd-6f843f9fb3ce" (UID: "4decc5db-72bf-4967-b2dd-6f843f9fb3ce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.706147 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.706182 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.706193 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.706205 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pnhk\" (UniqueName: \"kubernetes.io/projected/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-kube-api-access-5pnhk\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.706214 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4decc5db-72bf-4967-b2dd-6f843f9fb3ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.935321 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" event={"ID":"4decc5db-72bf-4967-b2dd-6f843f9fb3ce","Type":"ContainerDied","Data":"ef065f49dcc4dc2f027c68898e9f67e61c584283253a0c0201f9fda3060f8708"} Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.935367 4693 scope.go:117] "RemoveContainer" containerID="00a0e5266a15e605189392cd68b00a1bdb5aa566f7412a99d381fb13ef268035" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.935500 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf8bcbfcf-pwdqc" Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.989127 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf8bcbfcf-pwdqc"] Nov 25 12:27:07 crc kubenswrapper[4693]: I1125 12:27:07.997140 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf8bcbfcf-pwdqc"] Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.013474 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ksrd6"] Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.021394 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ksrd6"] Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.099010 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k89hc"] Nov 25 12:27:08 crc kubenswrapper[4693]: E1125 12:27:08.099397 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="dnsmasq-dns" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.099417 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="dnsmasq-dns" Nov 25 12:27:08 crc kubenswrapper[4693]: E1125 12:27:08.099434 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353ca052-e5b7-4165-a3b9-28906aeb9ecf" containerName="keystone-bootstrap" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.099440 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="353ca052-e5b7-4165-a3b9-28906aeb9ecf" containerName="keystone-bootstrap" Nov 25 12:27:08 crc kubenswrapper[4693]: E1125 12:27:08.099455 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="init" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.099461 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="init" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.099626 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="353ca052-e5b7-4165-a3b9-28906aeb9ecf" containerName="keystone-bootstrap" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.099641 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" containerName="dnsmasq-dns" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.100181 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.111105 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k89hc"] Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.158288 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.158609 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6w" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.158777 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.158941 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.159098 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.214303 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-fernet-keys\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.214575 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gt8\" (UniqueName: \"kubernetes.io/projected/064804d9-6da0-42a1-b0bd-9505f77588f8-kube-api-access-99gt8\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.214599 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-credential-keys\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.214742 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-combined-ca-bundle\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.214774 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-config-data\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.214808 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-scripts\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.316430 4693 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-combined-ca-bundle\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.316489 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-config-data\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.316527 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-scripts\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.316571 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-fernet-keys\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.316598 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99gt8\" (UniqueName: \"kubernetes.io/projected/064804d9-6da0-42a1-b0bd-9505f77588f8-kube-api-access-99gt8\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.316618 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-credential-keys\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.322285 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-config-data\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.322699 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-credential-keys\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.326667 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-scripts\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.328614 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-combined-ca-bundle\") pod \"keystone-bootstrap-k89hc\" (UID: 
\"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.334728 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gt8\" (UniqueName: \"kubernetes.io/projected/064804d9-6da0-42a1-b0bd-9505f77588f8-kube-api-access-99gt8\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.338935 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-fernet-keys\") pod \"keystone-bootstrap-k89hc\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.478423 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.825023 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353ca052-e5b7-4165-a3b9-28906aeb9ecf" path="/var/lib/kubelet/pods/353ca052-e5b7-4165-a3b9-28906aeb9ecf/volumes" Nov 25 12:27:08 crc kubenswrapper[4693]: I1125 12:27:08.825647 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4decc5db-72bf-4967-b2dd-6f843f9fb3ce" path="/var/lib/kubelet/pods/4decc5db-72bf-4967-b2dd-6f843f9fb3ce/volumes" Nov 25 12:27:21 crc kubenswrapper[4693]: E1125 12:27:21.337662 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645" Nov 25 12:27:21 crc kubenswrapper[4693]: E1125 12:27:21.338279 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sp26g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wqltl_openstack(7e344532-aeaf-4acf-9d1c-ebc0290e406e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:27:21 crc kubenswrapper[4693]: E1125 12:27:21.339779 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wqltl" podUID="7e344532-aeaf-4acf-9d1c-ebc0290e406e" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.470000 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.481564 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.498183 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.548399 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a054005c-e9b4-40f1-b795-4afc6df458f5-logs\") pod \"a054005c-e9b4-40f1-b795-4afc6df458f5\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549122 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-scripts\") pod \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549251 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52tr9\" (UniqueName: \"kubernetes.io/projected/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-kube-api-access-52tr9\") pod \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549396 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-scripts\") pod \"a054005c-e9b4-40f1-b795-4afc6df458f5\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549534 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2jpn\" (UniqueName: \"kubernetes.io/projected/a798438d-196b-45a1-9e5a-20aeff4ffd8b-kube-api-access-r2jpn\") pod \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549660 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-logs\") pod \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549777 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a798438d-196b-45a1-9e5a-20aeff4ffd8b-horizon-secret-key\") pod \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549867 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-horizon-secret-key\") pod \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.549911 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a054005c-e9b4-40f1-b795-4afc6df458f5-logs" (OuterVolumeSpecName: "logs") pod "a054005c-e9b4-40f1-b795-4afc6df458f5" (UID: "a054005c-e9b4-40f1-b795-4afc6df458f5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.550055 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-scripts" (OuterVolumeSpecName: "scripts") pod "432eeae5-f9f6-49cc-ab8d-d024cfed66cb" (UID: "432eeae5-f9f6-49cc-ab8d-d024cfed66cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.550108 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-scripts" (OuterVolumeSpecName: "scripts") pod "a054005c-e9b4-40f1-b795-4afc6df458f5" (UID: "a054005c-e9b4-40f1-b795-4afc6df458f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.550232 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-logs" (OuterVolumeSpecName: "logs") pod "432eeae5-f9f6-49cc-ab8d-d024cfed66cb" (UID: "432eeae5-f9f6-49cc-ab8d-d024cfed66cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.550213 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gsnd\" (UniqueName: \"kubernetes.io/projected/a054005c-e9b4-40f1-b795-4afc6df458f5-kube-api-access-9gsnd\") pod \"a054005c-e9b4-40f1-b795-4afc6df458f5\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.550851 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-config-data\") pod \"a054005c-e9b4-40f1-b795-4afc6df458f5\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.550937 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-config-data\") pod \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.550982 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a798438d-196b-45a1-9e5a-20aeff4ffd8b-logs\") pod \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.551061 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-scripts\") pod \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\" (UID: \"a798438d-196b-45a1-9e5a-20aeff4ffd8b\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.551102 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a054005c-e9b4-40f1-b795-4afc6df458f5-horizon-secret-key\") pod \"a054005c-e9b4-40f1-b795-4afc6df458f5\" (UID: \"a054005c-e9b4-40f1-b795-4afc6df458f5\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.551148 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-config-data\") pod \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\" (UID: \"432eeae5-f9f6-49cc-ab8d-d024cfed66cb\") " Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.551693 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-config-data" (OuterVolumeSpecName: "config-data") pod "a054005c-e9b4-40f1-b795-4afc6df458f5" (UID: "a054005c-e9b4-40f1-b795-4afc6df458f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.551725 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-config-data" (OuterVolumeSpecName: "config-data") pod "a798438d-196b-45a1-9e5a-20aeff4ffd8b" (UID: "a798438d-196b-45a1-9e5a-20aeff4ffd8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552231 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-config-data" (OuterVolumeSpecName: "config-data") pod "432eeae5-f9f6-49cc-ab8d-d024cfed66cb" (UID: "432eeae5-f9f6-49cc-ab8d-d024cfed66cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552512 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a798438d-196b-45a1-9e5a-20aeff4ffd8b-logs" (OuterVolumeSpecName: "logs") pod "a798438d-196b-45a1-9e5a-20aeff4ffd8b" (UID: "a798438d-196b-45a1-9e5a-20aeff4ffd8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552600 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-scripts" (OuterVolumeSpecName: "scripts") pod "a798438d-196b-45a1-9e5a-20aeff4ffd8b" (UID: "a798438d-196b-45a1-9e5a-20aeff4ffd8b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552725 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a798438d-196b-45a1-9e5a-20aeff4ffd8b-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552754 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552768 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a054005c-e9b4-40f1-b795-4afc6df458f5-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552782 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552794 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552805 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552815 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a054005c-e9b4-40f1-b795-4afc6df458f5-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.552825 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.557444 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a054005c-e9b4-40f1-b795-4afc6df458f5-kube-api-access-9gsnd" (OuterVolumeSpecName: "kube-api-access-9gsnd") pod "a054005c-e9b4-40f1-b795-4afc6df458f5" (UID: "a054005c-e9b4-40f1-b795-4afc6df458f5"). InnerVolumeSpecName "kube-api-access-9gsnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.558113 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a798438d-196b-45a1-9e5a-20aeff4ffd8b-kube-api-access-r2jpn" (OuterVolumeSpecName: "kube-api-access-r2jpn") pod "a798438d-196b-45a1-9e5a-20aeff4ffd8b" (UID: "a798438d-196b-45a1-9e5a-20aeff4ffd8b"). InnerVolumeSpecName "kube-api-access-r2jpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.558251 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-kube-api-access-52tr9" (OuterVolumeSpecName: "kube-api-access-52tr9") pod "432eeae5-f9f6-49cc-ab8d-d024cfed66cb" (UID: "432eeae5-f9f6-49cc-ab8d-d024cfed66cb"). InnerVolumeSpecName "kube-api-access-52tr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.558598 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a054005c-e9b4-40f1-b795-4afc6df458f5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a054005c-e9b4-40f1-b795-4afc6df458f5" (UID: "a054005c-e9b4-40f1-b795-4afc6df458f5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.558713 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a798438d-196b-45a1-9e5a-20aeff4ffd8b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a798438d-196b-45a1-9e5a-20aeff4ffd8b" (UID: "a798438d-196b-45a1-9e5a-20aeff4ffd8b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.570676 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "432eeae5-f9f6-49cc-ab8d-d024cfed66cb" (UID: "432eeae5-f9f6-49cc-ab8d-d024cfed66cb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.654496 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gsnd\" (UniqueName: \"kubernetes.io/projected/a054005c-e9b4-40f1-b795-4afc6df458f5-kube-api-access-9gsnd\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.654540 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a798438d-196b-45a1-9e5a-20aeff4ffd8b-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.654553 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a054005c-e9b4-40f1-b795-4afc6df458f5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.654563 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52tr9\" (UniqueName: \"kubernetes.io/projected/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-kube-api-access-52tr9\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.654576 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2jpn\" (UniqueName: \"kubernetes.io/projected/a798438d-196b-45a1-9e5a-20aeff4ffd8b-kube-api-access-r2jpn\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.654587 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a798438d-196b-45a1-9e5a-20aeff4ffd8b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:21 crc kubenswrapper[4693]: I1125 12:27:21.654597 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/432eeae5-f9f6-49cc-ab8d-d024cfed66cb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.052318 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5794c66b9f-22ws5" 
event={"ID":"a798438d-196b-45a1-9e5a-20aeff4ffd8b","Type":"ContainerDied","Data":"c5ebf4f0cb0d0255dcfe7c0b91977f61957d7e95b0203ed002642e327dcce38b"} Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.052354 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5794c66b9f-22ws5" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.054627 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6dc8b554d5-7gtks" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.054643 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dc8b554d5-7gtks" event={"ID":"a054005c-e9b4-40f1-b795-4afc6df458f5","Type":"ContainerDied","Data":"e5b9cc018ae3b77de4238fc51d13d7575ae98812b64848521c4e8602bd86694c"} Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.058033 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-644db6f6f-q824q" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.059338 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-644db6f6f-q824q" event={"ID":"432eeae5-f9f6-49cc-ab8d-d024cfed66cb","Type":"ContainerDied","Data":"35db3d2ebbf57222f6b6facc672200069ba51dd47406e92d1a190d9d0f077b79"} Nov 25 12:27:22 crc kubenswrapper[4693]: E1125 12:27:22.060314 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4c93a5cccb9971e24f05daf93b3aa11ba71752bc3469a1a1a2c4906f92f69645\\\"\"" pod="openstack/barbican-db-sync-wqltl" podUID="7e344532-aeaf-4acf-9d1c-ebc0290e406e" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.134808 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6dc8b554d5-7gtks"] Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.141502 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6dc8b554d5-7gtks"] Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.169074 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5794c66b9f-22ws5"] Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.182402 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5794c66b9f-22ws5"] Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.206271 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-644db6f6f-q824q"] Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.213344 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-644db6f6f-q824q"] Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.824057 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432eeae5-f9f6-49cc-ab8d-d024cfed66cb" path="/var/lib/kubelet/pods/432eeae5-f9f6-49cc-ab8d-d024cfed66cb/volumes" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.824799 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a054005c-e9b4-40f1-b795-4afc6df458f5" path="/var/lib/kubelet/pods/a054005c-e9b4-40f1-b795-4afc6df458f5/volumes" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.825230 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a798438d-196b-45a1-9e5a-20aeff4ffd8b" path="/var/lib/kubelet/pods/a798438d-196b-45a1-9e5a-20aeff4ffd8b/volumes" Nov 25 12:27:22 crc kubenswrapper[4693]: I1125 12:27:22.989762 4693 
scope.go:117] "RemoveContainer" containerID="a9973023fc1425d422e338854564bb3bb495e093c37e20d9d5b36e71f6924fed" Nov 25 12:27:23 crc kubenswrapper[4693]: E1125 12:27:23.023574 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879" Nov 25 12:27:23 crc kubenswrapper[4693]: E1125 12:27:23.023753 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4nglx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-kqn9c_openstack(fd001ffc-9a83-408f-bc46-9a7cacf052c7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 12:27:23 crc kubenswrapper[4693]: E1125 12:27:23.025021 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-kqn9c" podUID="fd001ffc-9a83-408f-bc46-9a7cacf052c7" Nov 25 12:27:23 
crc kubenswrapper[4693]: E1125 12:27:23.075814 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:37d64e0a00c54e71a4c1fcbbbf7e832f6886ffd03c9a02b6ee3ca48fabc30879\\\"\"" pod="openstack/cinder-db-sync-kqn9c" podUID="fd001ffc-9a83-408f-bc46-9a7cacf052c7" Nov 25 12:27:23 crc kubenswrapper[4693]: I1125 12:27:23.264719 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbcbd4584-78jln"] Nov 25 12:27:23 crc kubenswrapper[4693]: W1125 12:27:23.458556 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f60abf7_3c23_4174_9150_50061c054cf5.slice/crio-5fbd3f1b8bdb7e8b933ea91a128eb10c755273c96443365bb0cf920e3a71e90d WatchSource:0}: Error finding container 5fbd3f1b8bdb7e8b933ea91a128eb10c755273c96443365bb0cf920e3a71e90d: Status 404 returned error can't find the container with id 5fbd3f1b8bdb7e8b933ea91a128eb10c755273c96443365bb0cf920e3a71e90d Nov 25 12:27:23 crc kubenswrapper[4693]: I1125 12:27:23.504229 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k89hc"] Nov 25 12:27:23 crc kubenswrapper[4693]: I1125 12:27:23.511002 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-574fd6fdfd-bz6sm"] Nov 25 12:27:23 crc kubenswrapper[4693]: W1125 12:27:23.512511 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ff5a36_1912_43a8_b87f_57a6858a5799.slice/crio-5090108ae1cbf1f64d2521936d129be95cfd6d05cf4b4878d5a829b1e2631031 WatchSource:0}: Error finding container 5090108ae1cbf1f64d2521936d129be95cfd6d05cf4b4878d5a829b1e2631031: Status 404 returned error can't find the container with id 5090108ae1cbf1f64d2521936d129be95cfd6d05cf4b4878d5a829b1e2631031 Nov 25 12:27:23 crc kubenswrapper[4693]: W1125 12:27:23.516765 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod064804d9_6da0_42a1_b0bd_9505f77588f8.slice/crio-443eee529491fe2166bcc3328b35e95af93deec248c2b94ebf9a5b33ae258618 WatchSource:0}: Error finding container 443eee529491fe2166bcc3328b35e95af93deec248c2b94ebf9a5b33ae258618: Status 404 returned error can't find the container with id 443eee529491fe2166bcc3328b35e95af93deec248c2b94ebf9a5b33ae258618 Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.083997 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574fd6fdfd-bz6sm" event={"ID":"14ff5a36-1912-43a8-b87f-57a6858a5799","Type":"ContainerStarted","Data":"5090108ae1cbf1f64d2521936d129be95cfd6d05cf4b4878d5a829b1e2631031"} Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.086301 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-trpsm" event={"ID":"e953d6de-5a10-4627-8a1b-654ce6219d52","Type":"ContainerStarted","Data":"26492e710508f2f724b717ad10b8e86c2276539896a9d034e87babd3e7345c30"} Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.088616 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbcbd4584-78jln" event={"ID":"1f60abf7-3c23-4174-9150-50061c054cf5","Type":"ContainerStarted","Data":"5fbd3f1b8bdb7e8b933ea91a128eb10c755273c96443365bb0cf920e3a71e90d"} Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.090664 4693 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k89hc" event={"ID":"064804d9-6da0-42a1-b0bd-9505f77588f8","Type":"ContainerStarted","Data":"fd1b4bd19e829de777b80ab83cf74a05573309a57924b5a358ccc1f4d874ed0f"} Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.090728 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k89hc" event={"ID":"064804d9-6da0-42a1-b0bd-9505f77588f8","Type":"ContainerStarted","Data":"443eee529491fe2166bcc3328b35e95af93deec248c2b94ebf9a5b33ae258618"} Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.097182 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerStarted","Data":"5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4"} Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.114469 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-trpsm" podStartSLOduration=4.79304913 podStartE2EDuration="41.114447366s" podCreationTimestamp="2025-11-25 12:26:43 +0000 UTC" firstStartedPulling="2025-11-25 12:26:45.007202066 +0000 UTC m=+1124.925287447" lastFinishedPulling="2025-11-25 12:27:21.328600302 +0000 UTC m=+1161.246685683" observedRunningTime="2025-11-25 12:27:24.104953776 +0000 UTC m=+1164.023039157" watchObservedRunningTime="2025-11-25 12:27:24.114447366 +0000 UTC m=+1164.032532747" Nov 25 12:27:24 crc kubenswrapper[4693]: I1125 12:27:24.142989 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k89hc" podStartSLOduration=16.142929694 podStartE2EDuration="16.142929694s" podCreationTimestamp="2025-11-25 12:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:24.140734761 +0000 UTC m=+1164.058820142" watchObservedRunningTime="2025-11-25 12:27:24.142929694 +0000 UTC m=+1164.061015075" Nov 25 12:27:25 crc kubenswrapper[4693]: I1125 12:27:25.110506 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbcbd4584-78jln" event={"ID":"1f60abf7-3c23-4174-9150-50061c054cf5","Type":"ContainerStarted","Data":"d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f"} Nov 25 12:27:25 crc kubenswrapper[4693]: I1125 12:27:25.110802 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbcbd4584-78jln" event={"ID":"1f60abf7-3c23-4174-9150-50061c054cf5","Type":"ContainerStarted","Data":"5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38"} Nov 25 12:27:25 crc kubenswrapper[4693]: I1125 12:27:25.146085 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bbcbd4584-78jln" podStartSLOduration=32.510290946 podStartE2EDuration="33.146060795s" podCreationTimestamp="2025-11-25 12:26:52 +0000 UTC" firstStartedPulling="2025-11-25 12:27:23.465478062 +0000 UTC m=+1163.383563443" lastFinishedPulling="2025-11-25 12:27:24.101247891 +0000 UTC m=+1164.019333292" observedRunningTime="2025-11-25 12:27:25.13422913 +0000 UTC m=+1165.052314521" watchObservedRunningTime="2025-11-25 12:27:25.146060795 +0000 UTC m=+1165.064146176" Nov 25 12:27:28 crc kubenswrapper[4693]: I1125 12:27:28.149131 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574fd6fdfd-bz6sm" 
event={"ID":"14ff5a36-1912-43a8-b87f-57a6858a5799","Type":"ContainerStarted","Data":"7015db1713ac905c37c33dd11e6e78ee53d936c53beda9b2ea11e01650a950e9"} Nov 25 12:27:29 crc kubenswrapper[4693]: I1125 12:27:29.160042 4693 generic.go:334] "Generic (PLEG): container finished" podID="064804d9-6da0-42a1-b0bd-9505f77588f8" containerID="fd1b4bd19e829de777b80ab83cf74a05573309a57924b5a358ccc1f4d874ed0f" exitCode=0 Nov 25 12:27:29 crc kubenswrapper[4693]: I1125 12:27:29.160139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k89hc" event={"ID":"064804d9-6da0-42a1-b0bd-9505f77588f8","Type":"ContainerDied","Data":"fd1b4bd19e829de777b80ab83cf74a05573309a57924b5a358ccc1f4d874ed0f"} Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.169467 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-574fd6fdfd-bz6sm" event={"ID":"14ff5a36-1912-43a8-b87f-57a6858a5799","Type":"ContainerStarted","Data":"18f8a4cc21bc01e46aa58c98cda229ce94d0a8ff950a682c9949f4b742d396cb"} Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.171748 4693 generic.go:334] "Generic (PLEG): container finished" podID="e8135cf2-4e92-4e70-9c47-f5fae388c0be" containerID="8f7f05217b5ef487dfd9d9d2732365c15c6ba795b96a06a0e8ae5807fc0afe5e" exitCode=0 Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.172045 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qdmcl" event={"ID":"e8135cf2-4e92-4e70-9c47-f5fae388c0be","Type":"ContainerDied","Data":"8f7f05217b5ef487dfd9d9d2732365c15c6ba795b96a06a0e8ae5807fc0afe5e"} Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.521281 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.663640 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-config-data\") pod \"064804d9-6da0-42a1-b0bd-9505f77588f8\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.663685 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99gt8\" (UniqueName: \"kubernetes.io/projected/064804d9-6da0-42a1-b0bd-9505f77588f8-kube-api-access-99gt8\") pod \"064804d9-6da0-42a1-b0bd-9505f77588f8\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.663745 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-fernet-keys\") pod \"064804d9-6da0-42a1-b0bd-9505f77588f8\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.663860 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-scripts\") pod \"064804d9-6da0-42a1-b0bd-9505f77588f8\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.663907 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-credential-keys\") pod \"064804d9-6da0-42a1-b0bd-9505f77588f8\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " Nov 25 12:27:30 crc 
kubenswrapper[4693]: I1125 12:27:30.663981 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-combined-ca-bundle\") pod \"064804d9-6da0-42a1-b0bd-9505f77588f8\" (UID: \"064804d9-6da0-42a1-b0bd-9505f77588f8\") " Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.671148 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "064804d9-6da0-42a1-b0bd-9505f77588f8" (UID: "064804d9-6da0-42a1-b0bd-9505f77588f8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.671983 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/064804d9-6da0-42a1-b0bd-9505f77588f8-kube-api-access-99gt8" (OuterVolumeSpecName: "kube-api-access-99gt8") pod "064804d9-6da0-42a1-b0bd-9505f77588f8" (UID: "064804d9-6da0-42a1-b0bd-9505f77588f8"). InnerVolumeSpecName "kube-api-access-99gt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.671979 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "064804d9-6da0-42a1-b0bd-9505f77588f8" (UID: "064804d9-6da0-42a1-b0bd-9505f77588f8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.676517 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-scripts" (OuterVolumeSpecName: "scripts") pod "064804d9-6da0-42a1-b0bd-9505f77588f8" (UID: "064804d9-6da0-42a1-b0bd-9505f77588f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.694141 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "064804d9-6da0-42a1-b0bd-9505f77588f8" (UID: "064804d9-6da0-42a1-b0bd-9505f77588f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.694482 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-config-data" (OuterVolumeSpecName: "config-data") pod "064804d9-6da0-42a1-b0bd-9505f77588f8" (UID: "064804d9-6da0-42a1-b0bd-9505f77588f8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.766448 4693 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.766491 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.766520 4693 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.766534 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.766547 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/064804d9-6da0-42a1-b0bd-9505f77588f8-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:30 crc kubenswrapper[4693]: I1125 12:27:30.766558 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99gt8\" (UniqueName: \"kubernetes.io/projected/064804d9-6da0-42a1-b0bd-9505f77588f8-kube-api-access-99gt8\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.182783 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerStarted","Data":"d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6"} Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.185260 4693 generic.go:334] "Generic (PLEG): container finished" podID="e953d6de-5a10-4627-8a1b-654ce6219d52" containerID="26492e710508f2f724b717ad10b8e86c2276539896a9d034e87babd3e7345c30" exitCode=0 Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.185347 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-trpsm" event={"ID":"e953d6de-5a10-4627-8a1b-654ce6219d52","Type":"ContainerDied","Data":"26492e710508f2f724b717ad10b8e86c2276539896a9d034e87babd3e7345c30"} Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.187502 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k89hc" event={"ID":"064804d9-6da0-42a1-b0bd-9505f77588f8","Type":"ContainerDied","Data":"443eee529491fe2166bcc3328b35e95af93deec248c2b94ebf9a5b33ae258618"} Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.187531 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443eee529491fe2166bcc3328b35e95af93deec248c2b94ebf9a5b33ae258618" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.187723 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k89hc" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.246852 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-574fd6fdfd-bz6sm" podStartSLOduration=37.561405793 podStartE2EDuration="39.246832864s" podCreationTimestamp="2025-11-25 12:26:52 +0000 UTC" firstStartedPulling="2025-11-25 12:27:23.514686808 +0000 UTC m=+1163.432772189" lastFinishedPulling="2025-11-25 12:27:25.200113879 +0000 UTC m=+1165.118199260" observedRunningTime="2025-11-25 12:27:31.232017074 +0000 UTC m=+1171.150102475" watchObservedRunningTime="2025-11-25 12:27:31.246832864 +0000 UTC m=+1171.164918245" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.454835 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7888468d67-2bztz"] Nov 25 12:27:31 crc kubenswrapper[4693]: E1125 12:27:31.455300 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064804d9-6da0-42a1-b0bd-9505f77588f8" containerName="keystone-bootstrap" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.455326 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="064804d9-6da0-42a1-b0bd-9505f77588f8" containerName="keystone-bootstrap" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.455609 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="064804d9-6da0-42a1-b0bd-9505f77588f8" containerName="keystone-bootstrap" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.456317 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.459300 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.459737 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.459980 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6w" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.460070 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.460929 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.461326 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.485647 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7888468d67-2bztz"] Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.577680 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-scripts\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.577748 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-public-tls-certs\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " 
pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.577780 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-fernet-keys\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.577847 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hhw8\" (UniqueName: \"kubernetes.io/projected/e3c87b9d-25f9-445f-be14-b43f1cb887a4-kube-api-access-2hhw8\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.577881 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-internal-tls-certs\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.577926 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-combined-ca-bundle\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.578013 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-credential-keys\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.578093 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-config-data\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680015 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-public-tls-certs\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680276 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-fernet-keys\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680300 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hhw8\" (UniqueName: \"kubernetes.io/projected/e3c87b9d-25f9-445f-be14-b43f1cb887a4-kube-api-access-2hhw8\") pod \"keystone-7888468d67-2bztz\" (UID: 
\"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680340 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-internal-tls-certs\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680393 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-combined-ca-bundle\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680433 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-credential-keys\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680498 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-config-data\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.680561 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-scripts\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.685686 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-internal-tls-certs\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.687582 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-scripts\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.689337 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-config-data\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.699490 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-public-tls-certs\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.699587 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-credential-keys\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.699744 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-fernet-keys\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.702930 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hhw8\" (UniqueName: \"kubernetes.io/projected/e3c87b9d-25f9-445f-be14-b43f1cb887a4-kube-api-access-2hhw8\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.718087 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c87b9d-25f9-445f-be14-b43f1cb887a4-combined-ca-bundle\") pod \"keystone-7888468d67-2bztz\" (UID: \"e3c87b9d-25f9-445f-be14-b43f1cb887a4\") " pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.782024 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.900642 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qdmcl" Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.992716 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-combined-ca-bundle\") pod \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.993524 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vfxm\" (UniqueName: \"kubernetes.io/projected/e8135cf2-4e92-4e70-9c47-f5fae388c0be-kube-api-access-6vfxm\") pod \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.993628 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-db-sync-config-data\") pod \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " Nov 25 12:27:31 crc kubenswrapper[4693]: I1125 12:27:31.993705 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-config-data\") pod \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\" (UID: \"e8135cf2-4e92-4e70-9c47-f5fae388c0be\") " Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.003428 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e8135cf2-4e92-4e70-9c47-f5fae388c0be" (UID: 
"e8135cf2-4e92-4e70-9c47-f5fae388c0be"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.004569 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8135cf2-4e92-4e70-9c47-f5fae388c0be-kube-api-access-6vfxm" (OuterVolumeSpecName: "kube-api-access-6vfxm") pod "e8135cf2-4e92-4e70-9c47-f5fae388c0be" (UID: "e8135cf2-4e92-4e70-9c47-f5fae388c0be"). InnerVolumeSpecName "kube-api-access-6vfxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.088932 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8135cf2-4e92-4e70-9c47-f5fae388c0be" (UID: "e8135cf2-4e92-4e70-9c47-f5fae388c0be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.089133 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-config-data" (OuterVolumeSpecName: "config-data") pod "e8135cf2-4e92-4e70-9c47-f5fae388c0be" (UID: "e8135cf2-4e92-4e70-9c47-f5fae388c0be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.097745 4693 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.097771 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.097779 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8135cf2-4e92-4e70-9c47-f5fae388c0be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.097788 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vfxm\" (UniqueName: \"kubernetes.io/projected/e8135cf2-4e92-4e70-9c47-f5fae388c0be-kube-api-access-6vfxm\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.149884 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7888468d67-2bztz"] Nov 25 12:27:32 crc kubenswrapper[4693]: W1125 12:27:32.154035 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c87b9d_25f9_445f_be14_b43f1cb887a4.slice/crio-feecfbcf12ae5bc9d1547074259d827c22b17c5507f8ec9a6712b44ba05cfe71 WatchSource:0}: Error finding container feecfbcf12ae5bc9d1547074259d827c22b17c5507f8ec9a6712b44ba05cfe71: Status 404 returned error can't find the container with id feecfbcf12ae5bc9d1547074259d827c22b17c5507f8ec9a6712b44ba05cfe71 Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.210983 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7888468d67-2bztz" 
event={"ID":"e3c87b9d-25f9-445f-be14-b43f1cb887a4","Type":"ContainerStarted","Data":"feecfbcf12ae5bc9d1547074259d827c22b17c5507f8ec9a6712b44ba05cfe71"} Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.212235 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qdmcl" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.212267 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qdmcl" event={"ID":"e8135cf2-4e92-4e70-9c47-f5fae388c0be","Type":"ContainerDied","Data":"336effd30cf85b5126a77a64c8b26cb86ce36376c66c6054a7caf962fba260db"} Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.212289 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="336effd30cf85b5126a77a64c8b26cb86ce36376c66c6054a7caf962fba260db" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.534332 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-trpsm" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.564694 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.565151 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.595020 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-798745f775-7dmwn"] Nov 25 12:27:32 crc kubenswrapper[4693]: E1125 12:27:32.595530 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8135cf2-4e92-4e70-9c47-f5fae388c0be" containerName="glance-db-sync" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.595549 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8135cf2-4e92-4e70-9c47-f5fae388c0be" containerName="glance-db-sync" Nov 25 12:27:32 crc kubenswrapper[4693]: E1125 12:27:32.595568 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e953d6de-5a10-4627-8a1b-654ce6219d52" containerName="placement-db-sync" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.595575 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e953d6de-5a10-4627-8a1b-654ce6219d52" containerName="placement-db-sync" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.595763 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e953d6de-5a10-4627-8a1b-654ce6219d52" containerName="placement-db-sync" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.595784 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8135cf2-4e92-4e70-9c47-f5fae388c0be" containerName="glance-db-sync" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.596682 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.611898 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-scripts\") pod \"e953d6de-5a10-4627-8a1b-654ce6219d52\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.612002 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-config-data\") pod \"e953d6de-5a10-4627-8a1b-654ce6219d52\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.612047 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qcpd\" (UniqueName: \"kubernetes.io/projected/e953d6de-5a10-4627-8a1b-654ce6219d52-kube-api-access-6qcpd\") pod \"e953d6de-5a10-4627-8a1b-654ce6219d52\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.612101 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-combined-ca-bundle\") pod \"e953d6de-5a10-4627-8a1b-654ce6219d52\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.612230 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e953d6de-5a10-4627-8a1b-654ce6219d52-logs\") pod \"e953d6de-5a10-4627-8a1b-654ce6219d52\" (UID: \"e953d6de-5a10-4627-8a1b-654ce6219d52\") " Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.614747 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e953d6de-5a10-4627-8a1b-654ce6219d52-logs" (OuterVolumeSpecName: "logs") pod "e953d6de-5a10-4627-8a1b-654ce6219d52" (UID: "e953d6de-5a10-4627-8a1b-654ce6219d52"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.633449 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-scripts" (OuterVolumeSpecName: "scripts") pod "e953d6de-5a10-4627-8a1b-654ce6219d52" (UID: "e953d6de-5a10-4627-8a1b-654ce6219d52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.633594 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-798745f775-7dmwn"] Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.673078 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e953d6de-5a10-4627-8a1b-654ce6219d52-kube-api-access-6qcpd" (OuterVolumeSpecName: "kube-api-access-6qcpd") pod "e953d6de-5a10-4627-8a1b-654ce6219d52" (UID: "e953d6de-5a10-4627-8a1b-654ce6219d52"). InnerVolumeSpecName "kube-api-access-6qcpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.702273 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e953d6de-5a10-4627-8a1b-654ce6219d52" (UID: "e953d6de-5a10-4627-8a1b-654ce6219d52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.703868 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-config-data" (OuterVolumeSpecName: "config-data") pod "e953d6de-5a10-4627-8a1b-654ce6219d52" (UID: "e953d6de-5a10-4627-8a1b-654ce6219d52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.715638 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-svc\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.715765 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2f64\" (UniqueName: \"kubernetes.io/projected/99307935-f048-43df-85fe-54ff4b925e7f-kube-api-access-l2f64\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.715897 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-swift-storage-0\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.715972 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-nb\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.716095 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-config\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.716241 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-sb\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.716683 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e953d6de-5a10-4627-8a1b-654ce6219d52-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.716767 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.716846 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.716947 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qcpd\" (UniqueName: \"kubernetes.io/projected/e953d6de-5a10-4627-8a1b-654ce6219d52-kube-api-access-6qcpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.716990 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e953d6de-5a10-4627-8a1b-654ce6219d52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.817791 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-svc\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.817836 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2f64\" (UniqueName: \"kubernetes.io/projected/99307935-f048-43df-85fe-54ff4b925e7f-kube-api-access-l2f64\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.817858 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-swift-storage-0\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.817877 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-nb\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.817895 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-config\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.817924 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-sb\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 
12:27:32.818957 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-sb\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn"
Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.819893 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-swift-storage-0\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn"
Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.820249 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-nb\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn"
Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.828019 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-svc\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn"
Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.828496 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-config\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn"
Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.849434 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2f64\" (UniqueName: \"kubernetes.io/projected/99307935-f048-43df-85fe-54ff4b925e7f-kube-api-access-l2f64\") pod \"dnsmasq-dns-798745f775-7dmwn\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " pod="openstack/dnsmasq-dns-798745f775-7dmwn"
Nov 25 12:27:32 crc kubenswrapper[4693]: I1125 12:27:32.936681 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798745f775-7dmwn"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.258254 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-trpsm" event={"ID":"e953d6de-5a10-4627-8a1b-654ce6219d52","Type":"ContainerDied","Data":"2784d862924b2b818aa285fcc287e7187dcb511add39a5ca310784a07aefff71"}
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.258574 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2784d862924b2b818aa285fcc287e7187dcb511add39a5ca310784a07aefff71"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.258341 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-trpsm"
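Each MountVolume.SetUp entry above corresponds to a volume declared in the dnsmasq pod's spec: the configmaps dns-svc, config, ovsdbserver-nb, ovsdbserver-sb, dns-swift-storage-0, plus the projected service-account token kube-api-access-l2f64. A hedged client-go sketch, assuming a reachable cluster and the default kubeconfig, that lists those declared volumes; the program itself is illustrative, not part of kubelet:

```go
// Sketch: list the volumes a pod declares, which is what the
// reconciler/operation_generator entries above are working through.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes ~/.kube/config points at the CRC cluster from this log.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pod, err := cs.CoreV1().Pods("openstack").Get(
		context.TODO(), "dnsmasq-dns-798745f775-7dmwn", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, v := range pod.Spec.Volumes {
		// Expected names, per the log: dns-svc, config, ovsdbserver-nb,
		// ovsdbserver-sb, dns-swift-storage-0, kube-api-access-l2f64.
		fmt.Println(v.Name)
	}
}
```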
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.266556 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7888468d67-2bztz" event={"ID":"e3c87b9d-25f9-445f-be14-b43f1cb887a4","Type":"ContainerStarted","Data":"b0736318ed54accef682b6c86eff31170d2406bec489d3ae0fab894ed44b0bfe"}
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.266614 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7888468d67-2bztz"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.303794 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7888468d67-2bztz" podStartSLOduration=2.303777137 podStartE2EDuration="2.303777137s" podCreationTimestamp="2025-11-25 12:27:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:33.292321112 +0000 UTC m=+1173.210406503" watchObservedRunningTime="2025-11-25 12:27:33.303777137 +0000 UTC m=+1173.221862518"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.325844 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7bf98548b6-68m92"]
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.327654 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7bf98548b6-68m92"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.331205 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bf98548b6-68m92"]
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.331682 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.340787 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.341465 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p89nn"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.341834 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.341996 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.436832 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-internal-tls-certs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.437199 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-logs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92"
Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.437229 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-public-tls-certs\") pod
\"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.437278 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-scripts\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.437310 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdgl\" (UniqueName: \"kubernetes.io/projected/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-kube-api-access-5kdgl\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.437327 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-combined-ca-bundle\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.437413 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-config-data\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.538849 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-config-data\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.538937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-internal-tls-certs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.539046 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-logs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.539179 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-public-tls-certs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.539225 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-scripts\") pod \"placement-7bf98548b6-68m92\" (UID: 
\"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.539264 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdgl\" (UniqueName: \"kubernetes.io/projected/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-kube-api-access-5kdgl\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.539295 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-combined-ca-bundle\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.607098 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.613604 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.620988 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.621448 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.621508 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q2hwj" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.626710 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.647732 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.648598 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.702095 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-798745f775-7dmwn"] Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.708530 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-internal-tls-certs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.711305 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-config-data\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.711421 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-combined-ca-bundle\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 
25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.720772 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-logs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.723559 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-public-tls-certs\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.723645 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-scripts\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.724012 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdgl\" (UniqueName: \"kubernetes.io/projected/d7f68eff-0e73-43ec-bb9a-97fd321b92ec-kube-api-access-5kdgl\") pod \"placement-7bf98548b6-68m92\" (UID: \"d7f68eff-0e73-43ec-bb9a-97fd321b92ec\") " pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.741910 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.741970 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.742007 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5t6\" (UniqueName: \"kubernetes.io/projected/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-kube-api-access-lc5t6\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.742034 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.742093 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-config-data\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 
12:27:33.742118 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-scripts\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.742163 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-logs\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.808261 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.810263 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.814747 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.837018 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843430 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843481 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843510 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843567 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbdh\" (UniqueName: \"kubernetes.io/projected/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-kube-api-access-fbbdh\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843611 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-config-data\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843635 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-scripts\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843658 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843677 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-logs\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843712 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-logs\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843734 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.843986 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.844015 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.844039 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.844073 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5t6\" (UniqueName: \"kubernetes.io/projected/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-kube-api-access-lc5t6\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.844711 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-logs\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945497 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945554 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945639 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945663 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945698 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbdh\" (UniqueName: \"kubernetes.io/projected/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-kube-api-access-fbbdh\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945739 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.945920 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:33 crc kubenswrapper[4693]: I1125 12:27:33.978512 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.029009 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-logs\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.029558 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.033740 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-logs\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.033842 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.036239 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.037493 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.037894 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbdh\" (UniqueName: \"kubernetes.io/projected/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-kube-api-access-fbbdh\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.037905 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-scripts\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.039830 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.040211 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-config-data\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.044137 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.044403 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.060016 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5t6\" (UniqueName: \"kubernetes.io/projected/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-kube-api-access-lc5t6\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.075905 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.278194 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798745f775-7dmwn" event={"ID":"99307935-f048-43df-85fe-54ff4b925e7f","Type":"ContainerStarted","Data":"7c6b061fd4683543703ff1e465caaf13c2f356ade743b3cf47220cae20c5fdd5"} Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.309401 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.330349 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.539537 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7bf98548b6-68m92"] Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.811157 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:34 crc kubenswrapper[4693]: W1125 12:27:34.881297 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7f68eff_0e73_43ec_bb9a_97fd321b92ec.slice/crio-92d75c0153186f9c01aa26be33db7c853e988f935030944fdd53fa6a63830fcc WatchSource:0}: Error finding container 92d75c0153186f9c01aa26be33db7c853e988f935030944fdd53fa6a63830fcc: Status 404 returned error can't find the container with id 92d75c0153186f9c01aa26be33db7c853e988f935030944fdd53fa6a63830fcc Nov 25 12:27:34 crc kubenswrapper[4693]: W1125 12:27:34.883259 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf159c29e_ea1f_4a78_9462_e1f6e4779ddd.slice/crio-efadace4247428c6230a2518a7be4303c7b76247150ac00bae56227a338b44fc WatchSource:0}: Error finding container efadace4247428c6230a2518a7be4303c7b76247150ac00bae56227a338b44fc: Status 404 returned error can't find the container with id efadace4247428c6230a2518a7be4303c7b76247150ac00bae56227a338b44fc Nov 25 12:27:34 crc kubenswrapper[4693]: I1125 12:27:34.919758 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:35 crc kubenswrapper[4693]: I1125 12:27:35.113288 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:27:35 crc kubenswrapper[4693]: I1125 12:27:35.113669 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:27:35 crc kubenswrapper[4693]: I1125 12:27:35.290547 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f159c29e-ea1f-4a78-9462-e1f6e4779ddd","Type":"ContainerStarted","Data":"efadace4247428c6230a2518a7be4303c7b76247150ac00bae56227a338b44fc"} Nov 25 12:27:35 crc kubenswrapper[4693]: I1125 12:27:35.292728 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bf98548b6-68m92" event={"ID":"d7f68eff-0e73-43ec-bb9a-97fd321b92ec","Type":"ContainerStarted","Data":"92d75c0153186f9c01aa26be33db7c853e988f935030944fdd53fa6a63830fcc"} Nov 25 12:27:35 crc kubenswrapper[4693]: I1125 12:27:35.295887 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69","Type":"ContainerStarted","Data":"470efd6ccf70a9f0c76eea2fe257884ee853130cb004cbce83ffa9f39b1228f0"} Nov 25 12:27:35 crc kubenswrapper[4693]: I1125 12:27:35.436744 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:35 crc kubenswrapper[4693]: I1125 
12:27:35.492272 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:36 crc kubenswrapper[4693]: I1125 12:27:36.306900 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bf98548b6-68m92" event={"ID":"d7f68eff-0e73-43ec-bb9a-97fd321b92ec","Type":"ContainerStarted","Data":"5181ae19c9979085fe4b835f04edae1c9224e44ac4fb1f9febc00d2bbfc6c623"} Nov 25 12:27:36 crc kubenswrapper[4693]: I1125 12:27:36.308668 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798745f775-7dmwn" event={"ID":"99307935-f048-43df-85fe-54ff4b925e7f","Type":"ContainerStarted","Data":"4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81"} Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.323610 4693 generic.go:334] "Generic (PLEG): container finished" podID="99307935-f048-43df-85fe-54ff4b925e7f" containerID="4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81" exitCode=0 Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.323792 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798745f775-7dmwn" event={"ID":"99307935-f048-43df-85fe-54ff4b925e7f","Type":"ContainerDied","Data":"4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81"} Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.329097 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69","Type":"ContainerStarted","Data":"9514c03a52c75a9ad241fadcd1ab27ed270339735e4eff7afaf36a8c1dd2283c"} Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.329155 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69","Type":"ContainerStarted","Data":"757ce51c582b9914c42ee45d030b2afed731c9082c3c0394e7647af7a4f34f56"} Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.329208 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-log" containerID="cri-o://757ce51c582b9914c42ee45d030b2afed731c9082c3c0394e7647af7a4f34f56" gracePeriod=30 Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.329283 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-httpd" containerID="cri-o://9514c03a52c75a9ad241fadcd1ab27ed270339735e4eff7afaf36a8c1dd2283c" gracePeriod=30 Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.333841 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f159c29e-ea1f-4a78-9462-e1f6e4779ddd","Type":"ContainerStarted","Data":"2467ed0a84d33a0ff778d7f223d5711bdd3aeea03aa847d84ca940a9dac58702"} Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.333890 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f159c29e-ea1f-4a78-9462-e1f6e4779ddd","Type":"ContainerStarted","Data":"c036deb781bbbb3b193917c275f54f28a7a4a09fce48a49cbd2a31948a3e2f59"} Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.333922 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" 
containerName="glance-httpd" containerID="cri-o://2467ed0a84d33a0ff778d7f223d5711bdd3aeea03aa847d84ca940a9dac58702" gracePeriod=30 Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.333917 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerName="glance-log" containerID="cri-o://c036deb781bbbb3b193917c275f54f28a7a4a09fce48a49cbd2a31948a3e2f59" gracePeriod=30 Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.336244 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7bf98548b6-68m92" event={"ID":"d7f68eff-0e73-43ec-bb9a-97fd321b92ec","Type":"ContainerStarted","Data":"fb37e8a8685b98f66a07ec586d6aa948516dc3a6f21c04e0e4216e3eb3efdb9e"} Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.336590 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.336699 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.394098 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7bf98548b6-68m92" podStartSLOduration=4.394073654 podStartE2EDuration="4.394073654s" podCreationTimestamp="2025-11-25 12:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:37.384362048 +0000 UTC m=+1177.302447429" watchObservedRunningTime="2025-11-25 12:27:37.394073654 +0000 UTC m=+1177.312159035" Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.415505 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.41548369 podStartE2EDuration="5.41548369s" podCreationTimestamp="2025-11-25 12:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:37.405033544 +0000 UTC m=+1177.323118925" watchObservedRunningTime="2025-11-25 12:27:37.41548369 +0000 UTC m=+1177.333569071" Nov 25 12:27:37 crc kubenswrapper[4693]: I1125 12:27:37.429331 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.429310912 podStartE2EDuration="5.429310912s" podCreationTimestamp="2025-11-25 12:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:37.428017405 +0000 UTC m=+1177.346102796" watchObservedRunningTime="2025-11-25 12:27:37.429310912 +0000 UTC m=+1177.347396293" Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.360443 4693 generic.go:334] "Generic (PLEG): container finished" podID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerID="2467ed0a84d33a0ff778d7f223d5711bdd3aeea03aa847d84ca940a9dac58702" exitCode=143 Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.360773 4693 generic.go:334] "Generic (PLEG): container finished" podID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerID="c036deb781bbbb3b193917c275f54f28a7a4a09fce48a49cbd2a31948a3e2f59" exitCode=143 Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.360592 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f159c29e-ea1f-4a78-9462-e1f6e4779ddd","Type":"ContainerDied","Data":"2467ed0a84d33a0ff778d7f223d5711bdd3aeea03aa847d84ca940a9dac58702"} Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.360933 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f159c29e-ea1f-4a78-9462-e1f6e4779ddd","Type":"ContainerDied","Data":"c036deb781bbbb3b193917c275f54f28a7a4a09fce48a49cbd2a31948a3e2f59"} Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.363131 4693 generic.go:334] "Generic (PLEG): container finished" podID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerID="9514c03a52c75a9ad241fadcd1ab27ed270339735e4eff7afaf36a8c1dd2283c" exitCode=0 Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.363159 4693 generic.go:334] "Generic (PLEG): container finished" podID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerID="757ce51c582b9914c42ee45d030b2afed731c9082c3c0394e7647af7a4f34f56" exitCode=143 Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.363193 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69","Type":"ContainerDied","Data":"9514c03a52c75a9ad241fadcd1ab27ed270339735e4eff7afaf36a8c1dd2283c"} Nov 25 12:27:38 crc kubenswrapper[4693]: I1125 12:27:38.363241 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69","Type":"ContainerDied","Data":"757ce51c582b9914c42ee45d030b2afed731c9082c3c0394e7647af7a4f34f56"} Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.568801 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bbcbd4584-78jln" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.759330 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.818763 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:27:42 crc kubenswrapper[4693]: E1125 12:27:42.895960 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.921826 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5t6\" (UniqueName: \"kubernetes.io/projected/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-kube-api-access-lc5t6\") pod \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.921901 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-scripts\") pod \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.921957 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-logs\") pod \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.921979 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-combined-ca-bundle\") pod \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922017 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-httpd-run\") pod \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922049 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922106 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-scripts\") pod \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922127 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbbdh\" (UniqueName: \"kubernetes.io/projected/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-kube-api-access-fbbdh\") pod \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922162 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-combined-ca-bundle\") pod \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " Nov 25 
12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922228 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-httpd-run\") pod \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922259 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-config-data\") pod \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\" (UID: \"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922293 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-config-data\") pod \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922320 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-logs\") pod \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.922414 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\" (UID: \"f159c29e-ea1f-4a78-9462-e1f6e4779ddd\") " Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.924618 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-logs" (OuterVolumeSpecName: "logs") pod "f159c29e-ea1f-4a78-9462-e1f6e4779ddd" (UID: "f159c29e-ea1f-4a78-9462-e1f6e4779ddd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.924780 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f159c29e-ea1f-4a78-9462-e1f6e4779ddd" (UID: "f159c29e-ea1f-4a78-9462-e1f6e4779ddd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.925302 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" (UID: "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.925920 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-logs" (OuterVolumeSpecName: "logs") pod "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" (UID: "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.931658 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "f159c29e-ea1f-4a78-9462-e1f6e4779ddd" (UID: "f159c29e-ea1f-4a78-9462-e1f6e4779ddd"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.931837 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-kube-api-access-lc5t6" (OuterVolumeSpecName: "kube-api-access-lc5t6") pod "f159c29e-ea1f-4a78-9462-e1f6e4779ddd" (UID: "f159c29e-ea1f-4a78-9462-e1f6e4779ddd"). InnerVolumeSpecName "kube-api-access-lc5t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.945629 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-scripts" (OuterVolumeSpecName: "scripts") pod "f159c29e-ea1f-4a78-9462-e1f6e4779ddd" (UID: "f159c29e-ea1f-4a78-9462-e1f6e4779ddd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.948597 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-scripts" (OuterVolumeSpecName: "scripts") pod "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" (UID: "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.955553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-kube-api-access-fbbdh" (OuterVolumeSpecName: "kube-api-access-fbbdh") pod "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" (UID: "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69"). InnerVolumeSpecName "kube-api-access-fbbdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.961278 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f159c29e-ea1f-4a78-9462-e1f6e4779ddd" (UID: "f159c29e-ea1f-4a78-9462-e1f6e4779ddd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.962607 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" (UID: "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.990249 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" (UID: "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.994878 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-config-data" (OuterVolumeSpecName: "config-data") pod "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" (UID: "4df8c9f9-e06d-4f4a-8358-5cfa125c4a69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:42 crc kubenswrapper[4693]: I1125 12:27:42.996955 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-config-data" (OuterVolumeSpecName: "config-data") pod "f159c29e-ea1f-4a78-9462-e1f6e4779ddd" (UID: "f159c29e-ea1f-4a78-9462-e1f6e4779ddd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023806 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023833 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023842 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023851 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023883 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023893 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5t6\" (UniqueName: \"kubernetes.io/projected/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-kube-api-access-lc5t6\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023901 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023911 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023919 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023926 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 
12:27:43.023940 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023947 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023955 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbbdh\" (UniqueName: \"kubernetes.io/projected/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69-kube-api-access-fbbdh\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.023963 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f159c29e-ea1f-4a78-9462-e1f6e4779ddd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.040670 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.048888 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.125992 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.126029 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.412362 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f159c29e-ea1f-4a78-9462-e1f6e4779ddd","Type":"ContainerDied","Data":"efadace4247428c6230a2518a7be4303c7b76247150ac00bae56227a338b44fc"} Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.412427 4693 scope.go:117] "RemoveContainer" containerID="2467ed0a84d33a0ff778d7f223d5711bdd3aeea03aa847d84ca940a9dac58702" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.412534 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.417603 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798745f775-7dmwn" event={"ID":"99307935-f048-43df-85fe-54ff4b925e7f","Type":"ContainerStarted","Data":"6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0"} Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.418320 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.426740 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqltl" event={"ID":"7e344532-aeaf-4acf-9d1c-ebc0290e406e","Type":"ContainerStarted","Data":"065a856844c25d7a4c0e2ee8e0d95238d6749fa92e4591e83774bc825588066c"} Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.429138 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kqn9c" event={"ID":"fd001ffc-9a83-408f-bc46-9a7cacf052c7","Type":"ContainerStarted","Data":"a54d415c0cd66def5d5567c9cf0aac4246321420e6dd522d9247dcb5e9c9be6c"} Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.433993 4693 scope.go:117] "RemoveContainer" containerID="c036deb781bbbb3b193917c275f54f28a7a4a09fce48a49cbd2a31948a3e2f59" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.440110 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerStarted","Data":"89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2"} Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.440259 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="ceilometer-notification-agent" containerID="cri-o://5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4" gracePeriod=30 Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.440296 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.440342 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="sg-core" containerID="cri-o://d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6" gracePeriod=30 Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.440351 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="proxy-httpd" containerID="cri-o://89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2" gracePeriod=30 Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.448715 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4df8c9f9-e06d-4f4a-8358-5cfa125c4a69","Type":"ContainerDied","Data":"470efd6ccf70a9f0c76eea2fe257884ee853130cb004cbce83ffa9f39b1228f0"} Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.448826 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.453287 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-798745f775-7dmwn" podStartSLOduration=11.453266899 podStartE2EDuration="11.453266899s" podCreationTimestamp="2025-11-25 12:27:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:43.439804778 +0000 UTC m=+1183.357890159" watchObservedRunningTime="2025-11-25 12:27:43.453266899 +0000 UTC m=+1183.371352280" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.471320 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wqltl" podStartSLOduration=2.847512574 podStartE2EDuration="1m0.4712969s" podCreationTimestamp="2025-11-25 12:26:43 +0000 UTC" firstStartedPulling="2025-11-25 12:26:44.971101462 +0000 UTC m=+1124.889186843" lastFinishedPulling="2025-11-25 12:27:42.594885778 +0000 UTC m=+1182.512971169" observedRunningTime="2025-11-25 12:27:43.460883455 +0000 UTC m=+1183.378968836" watchObservedRunningTime="2025-11-25 12:27:43.4712969 +0000 UTC m=+1183.389382291" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.478583 4693 scope.go:117] "RemoveContainer" containerID="9514c03a52c75a9ad241fadcd1ab27ed270339735e4eff7afaf36a8c1dd2283c" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.482711 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.487418 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.505535 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: E1125 12:27:43.506021 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerName="glance-log" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506087 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerName="glance-log" Nov 25 12:27:43 crc kubenswrapper[4693]: E1125 12:27:43.506142 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-httpd" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506187 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-httpd" Nov 25 12:27:43 crc kubenswrapper[4693]: E1125 12:27:43.506247 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-log" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506298 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-log" Nov 25 12:27:43 crc kubenswrapper[4693]: E1125 12:27:43.506365 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerName="glance-httpd" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506436 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerName="glance-httpd" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506643 
4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerName="glance-httpd" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506715 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-httpd" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506775 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" containerName="glance-log" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.506833 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" containerName="glance-log" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.507938 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.511366 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q2hwj" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.511703 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.512247 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.512388 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.519891 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-kqn9c" podStartSLOduration=3.892709934 podStartE2EDuration="1m1.519868496s" podCreationTimestamp="2025-11-25 12:26:42 +0000 UTC" firstStartedPulling="2025-11-25 12:26:44.965520143 +0000 UTC m=+1124.883605524" lastFinishedPulling="2025-11-25 12:27:42.592678705 +0000 UTC m=+1182.510764086" observedRunningTime="2025-11-25 12:27:43.51188219 +0000 UTC m=+1183.429967571" watchObservedRunningTime="2025-11-25 12:27:43.519868496 +0000 UTC m=+1183.437953877" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.537705 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.568893 4693 scope.go:117] "RemoveContainer" containerID="757ce51c582b9914c42ee45d030b2afed731c9082c3c0394e7647af7a4f34f56" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640654 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-logs\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640738 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640769 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-scripts\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640791 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640808 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-config-data\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640840 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xv7\" (UniqueName: \"kubernetes.io/projected/a527ec6b-b211-43e1-afd2-6cfd2d60291a-kube-api-access-h9xv7\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640859 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.640913 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.654721 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-574fd6fdfd-bz6sm" podUID="14ff5a36-1912-43a8-b87f-57a6858a5799" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.679412 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.690160 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.698777 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.700089 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.704763 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.704900 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.707929 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743473 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-logs\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743591 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743635 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-scripts\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743663 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743688 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-config-data\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743733 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xv7\" (UniqueName: \"kubernetes.io/projected/a527ec6b-b211-43e1-afd2-6cfd2d60291a-kube-api-access-h9xv7\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743759 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743823 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743868 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-logs\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.743901 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.744090 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.749300 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-config-data\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.750475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.753095 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-scripts\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.757639 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.762089 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xv7\" (UniqueName: \"kubernetes.io/projected/a527ec6b-b211-43e1-afd2-6cfd2d60291a-kube-api-access-h9xv7\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.775037 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " pod="openstack/glance-default-external-api-0" Nov 25 
12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845334 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-logs\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845450 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845493 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845526 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845711 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmntp\" (UniqueName: \"kubernetes.io/projected/a3386396-6766-42b0-a683-af6f6c2da021-kube-api-access-gmntp\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845774 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845804 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.845827 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.912626 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.947740 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmntp\" (UniqueName: \"kubernetes.io/projected/a3386396-6766-42b0-a683-af6f6c2da021-kube-api-access-gmntp\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.947810 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.947847 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.947881 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.947963 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-logs\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.947989 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.948015 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.948039 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.948197 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Nov 
25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.952751 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.953577 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.955672 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-logs\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.957856 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.968745 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.969925 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmntp\" (UniqueName: \"kubernetes.io/projected/a3386396-6766-42b0-a683-af6f6c2da021-kube-api-access-gmntp\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:43 crc kubenswrapper[4693]: I1125 12:27:43.972669 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.006173 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.022073 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.462984 4693 generic.go:334] "Generic (PLEG): container finished" podID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerID="89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2" exitCode=0 Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.463033 4693 generic.go:334] "Generic (PLEG): container finished" podID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerID="d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6" exitCode=2 Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.463072 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerDied","Data":"89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2"} Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.463117 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerDied","Data":"d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6"} Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.464798 4693 generic.go:334] "Generic (PLEG): container finished" podID="6340c0be-f12e-4dac-908a-480c7ed0e1e8" containerID="f27fcc7bf6f370050ccc53520f9ced274f34d5f7fa6e8acf1c73bc6da3c1a826" exitCode=0 Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.464833 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68nxm" event={"ID":"6340c0be-f12e-4dac-908a-480c7ed0e1e8","Type":"ContainerDied","Data":"f27fcc7bf6f370050ccc53520f9ced274f34d5f7fa6e8acf1c73bc6da3c1a826"} Nov 25 12:27:44 crc kubenswrapper[4693]: W1125 12:27:44.489747 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda527ec6b_b211_43e1_afd2_6cfd2d60291a.slice/crio-04959e2365e811a09859ef558735122dfd01b478e0d2b21ba289f4001033c099 WatchSource:0}: Error finding container 04959e2365e811a09859ef558735122dfd01b478e0d2b21ba289f4001033c099: Status 404 returned error can't find the container with id 04959e2365e811a09859ef558735122dfd01b478e0d2b21ba289f4001033c099 Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.502542 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.622244 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:27:44 crc kubenswrapper[4693]: W1125 12:27:44.631958 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3386396_6766_42b0_a683_af6f6c2da021.slice/crio-11456f80e3966d77698a18e5e794768531126c46a88cecfed810e831392344ae WatchSource:0}: Error finding container 11456f80e3966d77698a18e5e794768531126c46a88cecfed810e831392344ae: Status 404 returned error can't find the container with id 11456f80e3966d77698a18e5e794768531126c46a88cecfed810e831392344ae Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.826655 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4df8c9f9-e06d-4f4a-8358-5cfa125c4a69" path="/var/lib/kubelet/pods/4df8c9f9-e06d-4f4a-8358-5cfa125c4a69/volumes" Nov 25 12:27:44 crc kubenswrapper[4693]: I1125 12:27:44.828103 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f159c29e-ea1f-4a78-9462-e1f6e4779ddd" path="/var/lib/kubelet/pods/f159c29e-ea1f-4a78-9462-e1f6e4779ddd/volumes" Nov 25 12:27:45 crc kubenswrapper[4693]: I1125 12:27:45.487154 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a527ec6b-b211-43e1-afd2-6cfd2d60291a","Type":"ContainerStarted","Data":"95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f"} Nov 25 12:27:45 crc kubenswrapper[4693]: I1125 12:27:45.487460 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a527ec6b-b211-43e1-afd2-6cfd2d60291a","Type":"ContainerStarted","Data":"04959e2365e811a09859ef558735122dfd01b478e0d2b21ba289f4001033c099"} Nov 25 12:27:45 crc kubenswrapper[4693]: I1125 12:27:45.488529 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a3386396-6766-42b0-a683-af6f6c2da021","Type":"ContainerStarted","Data":"b4f5419c71b94019c4a7731f744eac9712e45c8cde2a446e4cffbfba4d3efbbb"} Nov 25 12:27:45 crc kubenswrapper[4693]: I1125 12:27:45.488560 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a3386396-6766-42b0-a683-af6f6c2da021","Type":"ContainerStarted","Data":"11456f80e3966d77698a18e5e794768531126c46a88cecfed810e831392344ae"} Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.022206 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68nxm" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.187101 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-combined-ca-bundle\") pod \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.187643 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-config\") pod \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.187688 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvjj9\" (UniqueName: \"kubernetes.io/projected/6340c0be-f12e-4dac-908a-480c7ed0e1e8-kube-api-access-mvjj9\") pod \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\" (UID: \"6340c0be-f12e-4dac-908a-480c7ed0e1e8\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.197631 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6340c0be-f12e-4dac-908a-480c7ed0e1e8-kube-api-access-mvjj9" (OuterVolumeSpecName: "kube-api-access-mvjj9") pod "6340c0be-f12e-4dac-908a-480c7ed0e1e8" (UID: "6340c0be-f12e-4dac-908a-480c7ed0e1e8"). InnerVolumeSpecName "kube-api-access-mvjj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.216629 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6340c0be-f12e-4dac-908a-480c7ed0e1e8" (UID: "6340c0be-f12e-4dac-908a-480c7ed0e1e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.218450 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-config" (OuterVolumeSpecName: "config") pod "6340c0be-f12e-4dac-908a-480c7ed0e1e8" (UID: "6340c0be-f12e-4dac-908a-480c7ed0e1e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.269903 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.292619 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.292666 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvjj9\" (UniqueName: \"kubernetes.io/projected/6340c0be-f12e-4dac-908a-480c7ed0e1e8-kube-api-access-mvjj9\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.292681 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6340c0be-f12e-4dac-908a-480c7ed0e1e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.395021 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-log-httpd\") pod \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.395132 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-config-data\") pod \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.395236 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-run-httpd\") pod \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.395291 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-scripts\") pod \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.395339 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mbsb\" (UniqueName: \"kubernetes.io/projected/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-kube-api-access-7mbsb\") pod \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.395462 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-sg-core-conf-yaml\") pod \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " Nov 25 12:27:46 
crc kubenswrapper[4693]: I1125 12:27:46.395606 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0a41873-5fe3-4e4d-9a0c-556e6c85919d" (UID: "a0a41873-5fe3-4e4d-9a0c-556e6c85919d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.396134 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-combined-ca-bundle\") pod \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\" (UID: \"a0a41873-5fe3-4e4d-9a0c-556e6c85919d\") " Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.396842 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.399435 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-scripts" (OuterVolumeSpecName: "scripts") pod "a0a41873-5fe3-4e4d-9a0c-556e6c85919d" (UID: "a0a41873-5fe3-4e4d-9a0c-556e6c85919d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.400279 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-kube-api-access-7mbsb" (OuterVolumeSpecName: "kube-api-access-7mbsb") pod "a0a41873-5fe3-4e4d-9a0c-556e6c85919d" (UID: "a0a41873-5fe3-4e4d-9a0c-556e6c85919d"). InnerVolumeSpecName "kube-api-access-7mbsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.405126 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0a41873-5fe3-4e4d-9a0c-556e6c85919d" (UID: "a0a41873-5fe3-4e4d-9a0c-556e6c85919d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.424594 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0a41873-5fe3-4e4d-9a0c-556e6c85919d" (UID: "a0a41873-5fe3-4e4d-9a0c-556e6c85919d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.444829 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0a41873-5fe3-4e4d-9a0c-556e6c85919d" (UID: "a0a41873-5fe3-4e4d-9a0c-556e6c85919d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.460584 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-config-data" (OuterVolumeSpecName: "config-data") pod "a0a41873-5fe3-4e4d-9a0c-556e6c85919d" (UID: "a0a41873-5fe3-4e4d-9a0c-556e6c85919d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.498122 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68nxm" event={"ID":"6340c0be-f12e-4dac-908a-480c7ed0e1e8","Type":"ContainerDied","Data":"a3f51d2e9004a9fcbb50f357e8f16a140fabb016b12e5087568895e434908bbf"} Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.498160 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3f51d2e9004a9fcbb50f357e8f16a140fabb016b12e5087568895e434908bbf" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.498215 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68nxm" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.501741 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.501775 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.501789 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.501802 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mbsb\" (UniqueName: \"kubernetes.io/projected/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-kube-api-access-7mbsb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.501815 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.501826 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a41873-5fe3-4e4d-9a0c-556e6c85919d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.503192 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a3386396-6766-42b0-a683-af6f6c2da021","Type":"ContainerStarted","Data":"06de4c38e61999a12ce2fcb3bb3173374752e4183c7a3e7c975130c1e666ad25"} Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.507703 4693 generic.go:334] "Generic (PLEG): container finished" podID="7e344532-aeaf-4acf-9d1c-ebc0290e406e" containerID="065a856844c25d7a4c0e2ee8e0d95238d6749fa92e4591e83774bc825588066c" exitCode=0 Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.507804 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqltl" 
event={"ID":"7e344532-aeaf-4acf-9d1c-ebc0290e406e","Type":"ContainerDied","Data":"065a856844c25d7a4c0e2ee8e0d95238d6749fa92e4591e83774bc825588066c"} Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.511405 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a527ec6b-b211-43e1-afd2-6cfd2d60291a","Type":"ContainerStarted","Data":"09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75"} Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.517928 4693 generic.go:334] "Generic (PLEG): container finished" podID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerID="5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4" exitCode=0 Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.517984 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerDied","Data":"5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4"} Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.518002 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.518048 4693 scope.go:117] "RemoveContainer" containerID="89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.518035 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0a41873-5fe3-4e4d-9a0c-556e6c85919d","Type":"ContainerDied","Data":"ba1d20e04b6de6e5c625328ba0d87d54c8292f30776bc738678e90558665f7b4"} Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.546638 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.546613448 podStartE2EDuration="3.546613448s" podCreationTimestamp="2025-11-25 12:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:46.538399195 +0000 UTC m=+1186.456484586" watchObservedRunningTime="2025-11-25 12:27:46.546613448 +0000 UTC m=+1186.464698829" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.550029 4693 scope.go:117] "RemoveContainer" containerID="d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.584049 4693 scope.go:117] "RemoveContainer" containerID="5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.619895 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.619878034 podStartE2EDuration="3.619878034s" podCreationTimestamp="2025-11-25 12:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:46.618857426 +0000 UTC m=+1186.536942817" watchObservedRunningTime="2025-11-25 12:27:46.619878034 +0000 UTC m=+1186.537963415" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.673270 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.679604 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.700430 4693 scope.go:117] 
"RemoveContainer" containerID="89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2" Nov 25 12:27:46 crc kubenswrapper[4693]: E1125 12:27:46.711431 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2\": container with ID starting with 89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2 not found: ID does not exist" containerID="89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.711698 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2"} err="failed to get container status \"89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2\": rpc error: code = NotFound desc = could not find container \"89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2\": container with ID starting with 89f37350d609643bca0d1a0f90097b6f42da9b4c2444cb90caf590b1336c6ac2 not found: ID does not exist" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.711811 4693 scope.go:117] "RemoveContainer" containerID="d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6" Nov 25 12:27:46 crc kubenswrapper[4693]: E1125 12:27:46.714530 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6\": container with ID starting with d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6 not found: ID does not exist" containerID="d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.714778 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6"} err="failed to get container status \"d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6\": rpc error: code = NotFound desc = could not find container \"d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6\": container with ID starting with d7f7ea23ee559a794620a326b28f3ac5718e70f31db96381a81c2061d639bce6 not found: ID does not exist" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.714872 4693 scope.go:117] "RemoveContainer" containerID="5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4" Nov 25 12:27:46 crc kubenswrapper[4693]: E1125 12:27:46.715530 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4\": container with ID starting with 5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4 not found: ID does not exist" containerID="5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.715652 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4"} err="failed to get container status \"5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4\": rpc error: code = NotFound desc = could not find container \"5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4\": container with ID starting with 
5123ead9c59572d913b03236892a8508b7d0f2f776a09c58f794acc7b8cd91b4 not found: ID does not exist" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.724020 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798745f775-7dmwn"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.724280 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-798745f775-7dmwn" podUID="99307935-f048-43df-85fe-54ff4b925e7f" containerName="dnsmasq-dns" containerID="cri-o://6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0" gracePeriod=10 Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.755649 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:27:46 crc kubenswrapper[4693]: E1125 12:27:46.756045 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6340c0be-f12e-4dac-908a-480c7ed0e1e8" containerName="neutron-db-sync" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756066 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6340c0be-f12e-4dac-908a-480c7ed0e1e8" containerName="neutron-db-sync" Nov 25 12:27:46 crc kubenswrapper[4693]: E1125 12:27:46.756088 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="ceilometer-notification-agent" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756095 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="ceilometer-notification-agent" Nov 25 12:27:46 crc kubenswrapper[4693]: E1125 12:27:46.756110 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="sg-core" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756118 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="sg-core" Nov 25 12:27:46 crc kubenswrapper[4693]: E1125 12:27:46.756127 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="proxy-httpd" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756133 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="proxy-httpd" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756284 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6340c0be-f12e-4dac-908a-480c7ed0e1e8" containerName="neutron-db-sync" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756293 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="proxy-httpd" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756315 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="ceilometer-notification-agent" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.756328 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" containerName="sg-core" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.757913 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.766169 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.766489 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.785494 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.797803 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b95cfcf9c-rdbk5"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.799894 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.817479 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.817731 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2m6h\" (UniqueName: \"kubernetes.io/projected/95762039-b403-4702-9c63-025018a0d833-kube-api-access-l2m6h\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.817827 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-scripts\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.817985 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-run-httpd\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.818090 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.818173 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-config-data\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.818254 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-log-httpd\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.832161 4693 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a41873-5fe3-4e4d-9a0c-556e6c85919d" path="/var/lib/kubelet/pods/a0a41873-5fe3-4e4d-9a0c-556e6c85919d/volumes" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.837997 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b95cfcf9c-rdbk5"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.918329 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5545f6bb6d-vdngh"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.919585 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-config\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.922905 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923003 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923011 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2m6h\" (UniqueName: \"kubernetes.io/projected/95762039-b403-4702-9c63-025018a0d833-kube-api-access-l2m6h\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923065 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923097 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-scripts\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923143 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-swift-storage-0\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923254 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrr48\" (UniqueName: \"kubernetes.io/projected/d3bd3075-1211-402c-9e19-a8057ee182ea-kube-api-access-zrr48\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923308 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-run-httpd\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923350 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-svc\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923425 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923449 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-config-data\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923469 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-log-httpd\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.923493 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.925704 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-run-httpd\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.926747 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-log-httpd\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.928473 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.929574 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-config-data\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.933130 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-scripts\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.934425 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.935088 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.935389 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hk688" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.937775 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.940414 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.950299 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5545f6bb6d-vdngh"] Nov 25 12:27:46 crc kubenswrapper[4693]: I1125 12:27:46.959275 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2m6h\" (UniqueName: \"kubernetes.io/projected/95762039-b403-4702-9c63-025018a0d833-kube-api-access-l2m6h\") pod \"ceilometer-0\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " pod="openstack/ceilometer-0" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025616 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025705 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4d6n\" (UniqueName: \"kubernetes.io/projected/6921c173-adf8-47d6-9e9b-98657a453bdd-kube-api-access-x4d6n\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025786 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-combined-ca-bundle\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025813 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-httpd-config\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025841 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-config\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025901 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025926 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-config\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.025954 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-swift-storage-0\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.026006 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-ovndb-tls-certs\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.026050 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrr48\" (UniqueName: \"kubernetes.io/projected/d3bd3075-1211-402c-9e19-a8057ee182ea-kube-api-access-zrr48\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.026086 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-svc\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.027308 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-svc\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.027416 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-swift-storage-0\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.027530 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-nb\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.027904 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-config\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.028062 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-sb\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.050432 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrr48\" (UniqueName: \"kubernetes.io/projected/d3bd3075-1211-402c-9e19-a8057ee182ea-kube-api-access-zrr48\") pod \"dnsmasq-dns-5b95cfcf9c-rdbk5\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.102435 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.119533 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.127406 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4d6n\" (UniqueName: \"kubernetes.io/projected/6921c173-adf8-47d6-9e9b-98657a453bdd-kube-api-access-x4d6n\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.127522 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-combined-ca-bundle\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.127546 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-httpd-config\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.127613 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-config\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.127647 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-ovndb-tls-certs\") pod \"neutron-5545f6bb6d-vdngh\" (UID: 
\"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.134761 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-ovndb-tls-certs\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.135096 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-combined-ca-bundle\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.137861 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-httpd-config\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.141277 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-config\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.157083 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4d6n\" (UniqueName: \"kubernetes.io/projected/6921c173-adf8-47d6-9e9b-98657a453bdd-kube-api-access-x4d6n\") pod \"neutron-5545f6bb6d-vdngh\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.274711 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.332937 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2f64\" (UniqueName: \"kubernetes.io/projected/99307935-f048-43df-85fe-54ff4b925e7f-kube-api-access-l2f64\") pod \"99307935-f048-43df-85fe-54ff4b925e7f\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.333013 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-swift-storage-0\") pod \"99307935-f048-43df-85fe-54ff4b925e7f\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.333065 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-nb\") pod \"99307935-f048-43df-85fe-54ff4b925e7f\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.333098 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-sb\") pod \"99307935-f048-43df-85fe-54ff4b925e7f\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.333194 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-svc\") pod \"99307935-f048-43df-85fe-54ff4b925e7f\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.333261 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-config\") pod \"99307935-f048-43df-85fe-54ff4b925e7f\" (UID: \"99307935-f048-43df-85fe-54ff4b925e7f\") " Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.353127 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99307935-f048-43df-85fe-54ff4b925e7f-kube-api-access-l2f64" (OuterVolumeSpecName: "kube-api-access-l2f64") pod "99307935-f048-43df-85fe-54ff4b925e7f" (UID: "99307935-f048-43df-85fe-54ff4b925e7f"). InnerVolumeSpecName "kube-api-access-l2f64". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.398851 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99307935-f048-43df-85fe-54ff4b925e7f" (UID: "99307935-f048-43df-85fe-54ff4b925e7f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.402824 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99307935-f048-43df-85fe-54ff4b925e7f" (UID: "99307935-f048-43df-85fe-54ff4b925e7f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.426406 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-config" (OuterVolumeSpecName: "config") pod "99307935-f048-43df-85fe-54ff4b925e7f" (UID: "99307935-f048-43df-85fe-54ff4b925e7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.427512 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.428737 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99307935-f048-43df-85fe-54ff4b925e7f" (UID: "99307935-f048-43df-85fe-54ff4b925e7f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.430395 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99307935-f048-43df-85fe-54ff4b925e7f" (UID: "99307935-f048-43df-85fe-54ff4b925e7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.439435 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.439467 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2f64\" (UniqueName: \"kubernetes.io/projected/99307935-f048-43df-85fe-54ff4b925e7f-kube-api-access-l2f64\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.439481 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.439490 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.439502 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.439512 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99307935-f048-43df-85fe-54ff4b925e7f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.541473 4693 generic.go:334] "Generic (PLEG): container finished" podID="99307935-f048-43df-85fe-54ff4b925e7f" containerID="6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0" exitCode=0 Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.541541 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798745f775-7dmwn" 
event={"ID":"99307935-f048-43df-85fe-54ff4b925e7f","Type":"ContainerDied","Data":"6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0"} Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.541575 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798745f775-7dmwn" event={"ID":"99307935-f048-43df-85fe-54ff4b925e7f","Type":"ContainerDied","Data":"7c6b061fd4683543703ff1e465caaf13c2f356ade743b3cf47220cae20c5fdd5"} Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.541594 4693 scope.go:117] "RemoveContainer" containerID="6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.541748 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798745f775-7dmwn" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.585536 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798745f775-7dmwn"] Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.601663 4693 scope.go:117] "RemoveContainer" containerID="4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.604096 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-798745f775-7dmwn"] Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.633035 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b95cfcf9c-rdbk5"] Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.648964 4693 scope.go:117] "RemoveContainer" containerID="6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0" Nov 25 12:27:47 crc kubenswrapper[4693]: E1125 12:27:47.650172 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0\": container with ID starting with 6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0 not found: ID does not exist" containerID="6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.650207 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0"} err="failed to get container status \"6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0\": rpc error: code = NotFound desc = could not find container \"6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0\": container with ID starting with 6cb9420203a329c83bace22223fe075a691db168a2a8e7a57d6682a128eb06e0 not found: ID does not exist" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.650234 4693 scope.go:117] "RemoveContainer" containerID="4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81" Nov 25 12:27:47 crc kubenswrapper[4693]: E1125 12:27:47.652882 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81\": container with ID starting with 4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81 not found: ID does not exist" containerID="4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.652935 4693 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81"} err="failed to get container status \"4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81\": rpc error: code = NotFound desc = could not find container \"4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81\": container with ID starting with 4ca9bc02663449be764071ce2199e8fb130170edeaa2f16bbb4f74a6a1340b81 not found: ID does not exist" Nov 25 12:27:47 crc kubenswrapper[4693]: I1125 12:27:47.710608 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.027633 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wqltl" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.080835 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5545f6bb6d-vdngh"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.158043 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-db-sync-config-data\") pod \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.158487 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp26g\" (UniqueName: \"kubernetes.io/projected/7e344532-aeaf-4acf-9d1c-ebc0290e406e-kube-api-access-sp26g\") pod \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.158591 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-combined-ca-bundle\") pod \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\" (UID: \"7e344532-aeaf-4acf-9d1c-ebc0290e406e\") " Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.166195 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e344532-aeaf-4acf-9d1c-ebc0290e406e-kube-api-access-sp26g" (OuterVolumeSpecName: "kube-api-access-sp26g") pod "7e344532-aeaf-4acf-9d1c-ebc0290e406e" (UID: "7e344532-aeaf-4acf-9d1c-ebc0290e406e"). InnerVolumeSpecName "kube-api-access-sp26g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.181330 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7e344532-aeaf-4acf-9d1c-ebc0290e406e" (UID: "7e344532-aeaf-4acf-9d1c-ebc0290e406e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.224521 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e344532-aeaf-4acf-9d1c-ebc0290e406e" (UID: "7e344532-aeaf-4acf-9d1c-ebc0290e406e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.262099 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.262135 4693 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7e344532-aeaf-4acf-9d1c-ebc0290e406e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.262144 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp26g\" (UniqueName: \"kubernetes.io/projected/7e344532-aeaf-4acf-9d1c-ebc0290e406e-kube-api-access-sp26g\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.578004 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wqltl" event={"ID":"7e344532-aeaf-4acf-9d1c-ebc0290e406e","Type":"ContainerDied","Data":"7f8e1fee741ea05a7ac2d90f027a194b7215b4c0f0483e26d1dfbcf608c25f42"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.578498 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f8e1fee741ea05a7ac2d90f027a194b7215b4c0f0483e26d1dfbcf608c25f42" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.578326 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wqltl" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.588145 4693 generic.go:334] "Generic (PLEG): container finished" podID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerID="de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36" exitCode=0 Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.588306 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" event={"ID":"d3bd3075-1211-402c-9e19-a8057ee182ea","Type":"ContainerDied","Data":"de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.589290 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" event={"ID":"d3bd3075-1211-402c-9e19-a8057ee182ea","Type":"ContainerStarted","Data":"e419e9dae978c7dc8ab0fe544a733f86ef285ccd5dacb96cee87298474f6f731"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.607616 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5545f6bb6d-vdngh" event={"ID":"6921c173-adf8-47d6-9e9b-98657a453bdd","Type":"ContainerStarted","Data":"bdb58e6825c07b11690e9e20ed28fd650708d53911c39bfd49c42098e1282447"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.607654 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5545f6bb6d-vdngh" event={"ID":"6921c173-adf8-47d6-9e9b-98657a453bdd","Type":"ContainerStarted","Data":"a5e1cedfcb4eb196cf8cacbccc23aae6033bf7273ba1de2cf3d0818143c00a2c"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.607664 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5545f6bb6d-vdngh" event={"ID":"6921c173-adf8-47d6-9e9b-98657a453bdd","Type":"ContainerStarted","Data":"fcf57349ab784191df0b0a745d9f2d2bab70f9882baa88d392bfd203b86a8bb6"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.610667 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.624610 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerStarted","Data":"0a07f378bdf5f11d549e1b6c7b6f53f411963a48b4407744f762aca7905fc9f4"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.624658 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerStarted","Data":"16990bd1384f409a258f9cf9079354c3a5e0f2c6159cac42a2e692094cf4da87"} Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.630923 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5545f6bb6d-vdngh" podStartSLOduration=2.630905016 podStartE2EDuration="2.630905016s" podCreationTimestamp="2025-11-25 12:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:48.63000495 +0000 UTC m=+1188.548090351" watchObservedRunningTime="2025-11-25 12:27:48.630905016 +0000 UTC m=+1188.548990397" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.736461 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7d9fc5569-wdqkp"] Nov 25 12:27:48 crc kubenswrapper[4693]: E1125 12:27:48.736825 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99307935-f048-43df-85fe-54ff4b925e7f" containerName="init" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.736842 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="99307935-f048-43df-85fe-54ff4b925e7f" containerName="init" Nov 25 12:27:48 crc kubenswrapper[4693]: E1125 12:27:48.736860 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e344532-aeaf-4acf-9d1c-ebc0290e406e" containerName="barbican-db-sync" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.736866 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e344532-aeaf-4acf-9d1c-ebc0290e406e" containerName="barbican-db-sync" Nov 25 12:27:48 crc kubenswrapper[4693]: E1125 12:27:48.736887 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99307935-f048-43df-85fe-54ff4b925e7f" containerName="dnsmasq-dns" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.736894 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="99307935-f048-43df-85fe-54ff4b925e7f" containerName="dnsmasq-dns" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.737059 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e344532-aeaf-4acf-9d1c-ebc0290e406e" containerName="barbican-db-sync" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.737070 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="99307935-f048-43df-85fe-54ff4b925e7f" containerName="dnsmasq-dns" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.737970 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.742196 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.742936 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.743206 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f2cgs" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.757458 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d9fc5569-wdqkp"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.770366 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-config-data-custom\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.770650 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-combined-ca-bundle\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.770694 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99b8r\" (UniqueName: \"kubernetes.io/projected/084144ce-d043-4dd8-bc4b-e904c42e47cd-kube-api-access-99b8r\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.770719 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-config-data\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.770773 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084144ce-d043-4dd8-bc4b-e904c42e47cd-logs\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.790996 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d949d856b-fbdcg"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.792589 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.797574 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.872519 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-combined-ca-bundle\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.872832 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99b8r\" (UniqueName: \"kubernetes.io/projected/084144ce-d043-4dd8-bc4b-e904c42e47cd-kube-api-access-99b8r\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.873014 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-config-data\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.873172 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084144ce-d043-4dd8-bc4b-e904c42e47cd-logs\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.873281 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99307935-f048-43df-85fe-54ff4b925e7f" path="/var/lib/kubelet/pods/99307935-f048-43df-85fe-54ff4b925e7f/volumes" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.874407 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-config-data-custom\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.884406 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-config-data-custom\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.884715 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/084144ce-d043-4dd8-bc4b-e904c42e47cd-logs\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.884781 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d949d856b-fbdcg"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.887889 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-config-data\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.892626 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084144ce-d043-4dd8-bc4b-e904c42e47cd-combined-ca-bundle\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.900431 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b95cfcf9c-rdbk5"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.922445 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99b8r\" (UniqueName: \"kubernetes.io/projected/084144ce-d043-4dd8-bc4b-e904c42e47cd-kube-api-access-99b8r\") pod \"barbican-worker-7d9fc5569-wdqkp\" (UID: \"084144ce-d043-4dd8-bc4b-e904c42e47cd\") " pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.943105 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66b66f7449-hxwtk"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.944953 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.966550 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b66f7449-hxwtk"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.987642 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-combined-ca-bundle\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.989415 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c743d467-4bdb-41ce-bf74-5051a93fc3d6-logs\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.989505 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-config-data\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.989824 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-config-data-custom\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.989878 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlcn\" (UniqueName: \"kubernetes.io/projected/c743d467-4bdb-41ce-bf74-5051a93fc3d6-kube-api-access-cvlcn\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.991435 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75fb97c486-4w76r"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.993884 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75fb97c486-4w76r"] Nov 25 12:27:48 crc kubenswrapper[4693]: I1125 12:27:48.994020 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:48.997740 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.052021 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57c497f557-r9sp7"] Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.053730 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.057493 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.061486 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091159 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-config\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091222 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-sb\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091258 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q9sj\" (UniqueName: \"kubernetes.io/projected/d089afe1-1724-42ca-8204-355e98e903f2-kube-api-access-2q9sj\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091319 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-combined-ca-bundle\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091344 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c743d467-4bdb-41ce-bf74-5051a93fc3d6-logs\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091435 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-nb\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091483 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-config-data\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091537 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-svc\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091572 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-swift-storage-0\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091617 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-config-data-custom\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091653 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlcn\" (UniqueName: \"kubernetes.io/projected/c743d467-4bdb-41ce-bf74-5051a93fc3d6-kube-api-access-cvlcn\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.091996 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c743d467-4bdb-41ce-bf74-5051a93fc3d6-logs\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.095631 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-combined-ca-bundle\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc 
kubenswrapper[4693]: I1125 12:27:49.096712 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-config-data-custom\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.098258 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c743d467-4bdb-41ce-bf74-5051a93fc3d6-config-data\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.108580 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57c497f557-r9sp7"] Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.110956 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvlcn\" (UniqueName: \"kubernetes.io/projected/c743d467-4bdb-41ce-bf74-5051a93fc3d6-kube-api-access-cvlcn\") pod \"barbican-keystone-listener-7d949d856b-fbdcg\" (UID: \"c743d467-4bdb-41ce-bf74-5051a93fc3d6\") " pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.166336 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7d9fc5569-wdqkp" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.197899 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1342b7d-3a59-4236-9673-f0b377a5657d-logs\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.198051 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-svc\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.198129 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-public-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.200169 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-svc\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.200310 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-combined-ca-bundle\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc 
kubenswrapper[4693]: I1125 12:27:49.200418 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-swift-storage-0\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.200683 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-config\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.200819 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-sb\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.200940 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-config\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.200972 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4k4\" (UniqueName: \"kubernetes.io/projected/d1342b7d-3a59-4236-9673-f0b377a5657d-kube-api-access-wm4k4\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.201094 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q9sj\" (UniqueName: \"kubernetes.io/projected/d089afe1-1724-42ca-8204-355e98e903f2-kube-api-access-2q9sj\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.201141 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-ovndb-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.201256 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-internal-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.201290 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnc9t\" (UniqueName: \"kubernetes.io/projected/66ccc10f-a153-4582-ab8d-f687b0c6bb20-kube-api-access-nnc9t\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " 
pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.201427 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.201462 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-httpd-config\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.202114 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-config\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.202526 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-nb\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.202586 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data-custom\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.202664 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-combined-ca-bundle\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.204436 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-nb\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.204914 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-sb\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.205293 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-swift-storage-0\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " 
pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.223632 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q9sj\" (UniqueName: \"kubernetes.io/projected/d089afe1-1724-42ca-8204-355e98e903f2-kube-api-access-2q9sj\") pod \"dnsmasq-dns-66b66f7449-hxwtk\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.280166 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.303970 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data-custom\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.304029 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-combined-ca-bundle\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.304077 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1342b7d-3a59-4236-9673-f0b377a5657d-logs\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.304106 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-public-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.304131 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-combined-ca-bundle\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305387 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-config\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305409 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4k4\" (UniqueName: \"kubernetes.io/projected/d1342b7d-3a59-4236-9673-f0b377a5657d-kube-api-access-wm4k4\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305443 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-ovndb-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305480 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-internal-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305498 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnc9t\" (UniqueName: \"kubernetes.io/projected/66ccc10f-a153-4582-ab8d-f687b0c6bb20-kube-api-access-nnc9t\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305526 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305543 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-httpd-config\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.305561 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1342b7d-3a59-4236-9673-f0b377a5657d-logs\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.315034 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-public-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.315105 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-combined-ca-bundle\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.315793 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.319676 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-httpd-config\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " 
pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.320211 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-internal-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.321212 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-ovndb-tls-certs\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.321630 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.322109 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ccc10f-a153-4582-ab8d-f687b0c6bb20-config\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.322809 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-combined-ca-bundle\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.324304 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data-custom\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.325578 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnc9t\" (UniqueName: \"kubernetes.io/projected/66ccc10f-a153-4582-ab8d-f687b0c6bb20-kube-api-access-nnc9t\") pod \"neutron-57c497f557-r9sp7\" (UID: \"66ccc10f-a153-4582-ab8d-f687b0c6bb20\") " pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.327992 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4k4\" (UniqueName: \"kubernetes.io/projected/d1342b7d-3a59-4236-9673-f0b377a5657d-kube-api-access-wm4k4\") pod \"barbican-api-75fb97c486-4w76r\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.335410 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.451873 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.669951 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerStarted","Data":"3cf88dcc2049c6ad6f192ea88943ed16f2524e3a00b421e237b0f9b3608a6f93"} Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.684851 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" podUID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerName="dnsmasq-dns" containerID="cri-o://b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf" gracePeriod=10 Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.686081 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" event={"ID":"d3bd3075-1211-402c-9e19-a8057ee182ea","Type":"ContainerStarted","Data":"b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf"} Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.686168 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.699114 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7d9fc5569-wdqkp"] Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.702682 4693 generic.go:334] "Generic (PLEG): container finished" podID="fd001ffc-9a83-408f-bc46-9a7cacf052c7" containerID="a54d415c0cd66def5d5567c9cf0aac4246321420e6dd522d9247dcb5e9c9be6c" exitCode=0 Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.703018 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kqn9c" event={"ID":"fd001ffc-9a83-408f-bc46-9a7cacf052c7","Type":"ContainerDied","Data":"a54d415c0cd66def5d5567c9cf0aac4246321420e6dd522d9247dcb5e9c9be6c"} Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.722677 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" podStartSLOduration=3.7226605 podStartE2EDuration="3.7226605s" podCreationTimestamp="2025-11-25 12:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:49.711592256 +0000 UTC m=+1189.629677637" watchObservedRunningTime="2025-11-25 12:27:49.7226605 +0000 UTC m=+1189.640745881" Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.814875 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d949d856b-fbdcg"] Nov 25 12:27:49 crc kubenswrapper[4693]: W1125 12:27:49.815865 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc743d467_4bdb_41ce_bf74_5051a93fc3d6.slice/crio-8623d87cc34a8ced86edf65d1085cae39163dc10f8c6539934c4c74649a7259a WatchSource:0}: Error finding container 8623d87cc34a8ced86edf65d1085cae39163dc10f8c6539934c4c74649a7259a: Status 404 returned error can't find the container with id 8623d87cc34a8ced86edf65d1085cae39163dc10f8c6539934c4c74649a7259a Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.969103 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75fb97c486-4w76r"] Nov 25 12:27:49 crc kubenswrapper[4693]: I1125 12:27:49.990195 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-66b66f7449-hxwtk"] Nov 25 12:27:50 crc kubenswrapper[4693]: W1125 12:27:50.036003 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd089afe1_1724_42ca_8204_355e98e903f2.slice/crio-71205812107c22ee15bca0e8208c1372100375756f42b2455fc8f731eda7571e WatchSource:0}: Error finding container 71205812107c22ee15bca0e8208c1372100375756f42b2455fc8f731eda7571e: Status 404 returned error can't find the container with id 71205812107c22ee15bca0e8208c1372100375756f42b2455fc8f731eda7571e Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.185551 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57c497f557-r9sp7"] Nov 25 12:27:50 crc kubenswrapper[4693]: W1125 12:27:50.189861 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66ccc10f_a153_4582_ab8d_f687b0c6bb20.slice/crio-70ad8f700829b45bff541fd986a7ad29db888ee2d8be076db45022c124daf631 WatchSource:0}: Error finding container 70ad8f700829b45bff541fd986a7ad29db888ee2d8be076db45022c124daf631: Status 404 returned error can't find the container with id 70ad8f700829b45bff541fd986a7ad29db888ee2d8be076db45022c124daf631 Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.268997 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.329239 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-swift-storage-0\") pod \"d3bd3075-1211-402c-9e19-a8057ee182ea\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.332817 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-nb\") pod \"d3bd3075-1211-402c-9e19-a8057ee182ea\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.332876 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrr48\" (UniqueName: \"kubernetes.io/projected/d3bd3075-1211-402c-9e19-a8057ee182ea-kube-api-access-zrr48\") pod \"d3bd3075-1211-402c-9e19-a8057ee182ea\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.332954 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-svc\") pod \"d3bd3075-1211-402c-9e19-a8057ee182ea\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.333049 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-config\") pod \"d3bd3075-1211-402c-9e19-a8057ee182ea\" (UID: \"d3bd3075-1211-402c-9e19-a8057ee182ea\") " Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.333124 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-sb\") pod \"d3bd3075-1211-402c-9e19-a8057ee182ea\" (UID: 
\"d3bd3075-1211-402c-9e19-a8057ee182ea\") " Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.339143 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bd3075-1211-402c-9e19-a8057ee182ea-kube-api-access-zrr48" (OuterVolumeSpecName: "kube-api-access-zrr48") pod "d3bd3075-1211-402c-9e19-a8057ee182ea" (UID: "d3bd3075-1211-402c-9e19-a8057ee182ea"). InnerVolumeSpecName "kube-api-access-zrr48". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.396046 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d3bd3075-1211-402c-9e19-a8057ee182ea" (UID: "d3bd3075-1211-402c-9e19-a8057ee182ea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.397465 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3bd3075-1211-402c-9e19-a8057ee182ea" (UID: "d3bd3075-1211-402c-9e19-a8057ee182ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.436550 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.436581 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrr48\" (UniqueName: \"kubernetes.io/projected/d3bd3075-1211-402c-9e19-a8057ee182ea-kube-api-access-zrr48\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.436593 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.441776 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3bd3075-1211-402c-9e19-a8057ee182ea" (UID: "d3bd3075-1211-402c-9e19-a8057ee182ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.441834 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3bd3075-1211-402c-9e19-a8057ee182ea" (UID: "d3bd3075-1211-402c-9e19-a8057ee182ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.451662 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-config" (OuterVolumeSpecName: "config") pod "d3bd3075-1211-402c-9e19-a8057ee182ea" (UID: "d3bd3075-1211-402c-9e19-a8057ee182ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.539501 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.539778 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.539790 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd3075-1211-402c-9e19-a8057ee182ea-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.712413 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75fb97c486-4w76r" event={"ID":"d1342b7d-3a59-4236-9673-f0b377a5657d","Type":"ContainerStarted","Data":"60629e8736cd7781cad2195e7d50a379bdfd10cacd5131cbd36cb0edc07606f7"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.712458 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75fb97c486-4w76r" event={"ID":"d1342b7d-3a59-4236-9673-f0b377a5657d","Type":"ContainerStarted","Data":"ad0bf7300e1387eaba5190c6c5d1df54634ae87a3c146b97422f5dae5aff3653"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.712467 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75fb97c486-4w76r" event={"ID":"d1342b7d-3a59-4236-9673-f0b377a5657d","Type":"ContainerStarted","Data":"d7961009b26615ac257554386c2cab8bd2605925fa87b1ebc1ec4bdf2941f3cd"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.714689 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.714753 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.716483 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c497f557-r9sp7" event={"ID":"66ccc10f-a153-4582-ab8d-f687b0c6bb20","Type":"ContainerStarted","Data":"9f1e9c0185e291abb03516ac6a38aedbc9b34d7075ca9bc39b77ba73586c6749"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.716517 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c497f557-r9sp7" event={"ID":"66ccc10f-a153-4582-ab8d-f687b0c6bb20","Type":"ContainerStarted","Data":"70ad8f700829b45bff541fd986a7ad29db888ee2d8be076db45022c124daf631"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.727437 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" event={"ID":"c743d467-4bdb-41ce-bf74-5051a93fc3d6","Type":"ContainerStarted","Data":"8623d87cc34a8ced86edf65d1085cae39163dc10f8c6539934c4c74649a7259a"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.743602 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75fb97c486-4w76r" podStartSLOduration=2.7435815569999997 podStartE2EDuration="2.743581557s" podCreationTimestamp="2025-11-25 12:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:50.741230961 
+0000 UTC m=+1190.659316352" watchObservedRunningTime="2025-11-25 12:27:50.743581557 +0000 UTC m=+1190.661666938" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.753843 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerStarted","Data":"9fe7c42f488b278c479461f0c4076420123053701d3ec2aff03e9d259fe36b65"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.768802 4693 generic.go:334] "Generic (PLEG): container finished" podID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerID="b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf" exitCode=0 Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.768865 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" event={"ID":"d3bd3075-1211-402c-9e19-a8057ee182ea","Type":"ContainerDied","Data":"b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.768893 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" event={"ID":"d3bd3075-1211-402c-9e19-a8057ee182ea","Type":"ContainerDied","Data":"e419e9dae978c7dc8ab0fe544a733f86ef285ccd5dacb96cee87298474f6f731"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.768909 4693 scope.go:117] "RemoveContainer" containerID="b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.769055 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b95cfcf9c-rdbk5" Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.778329 4693 generic.go:334] "Generic (PLEG): container finished" podID="d089afe1-1724-42ca-8204-355e98e903f2" containerID="b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97" exitCode=0 Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.778514 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" event={"ID":"d089afe1-1724-42ca-8204-355e98e903f2","Type":"ContainerDied","Data":"b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.778607 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" event={"ID":"d089afe1-1724-42ca-8204-355e98e903f2","Type":"ContainerStarted","Data":"71205812107c22ee15bca0e8208c1372100375756f42b2455fc8f731eda7571e"} Nov 25 12:27:50 crc kubenswrapper[4693]: I1125 12:27:50.792520 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d9fc5569-wdqkp" event={"ID":"084144ce-d043-4dd8-bc4b-e904c42e47cd","Type":"ContainerStarted","Data":"65f094cf9d9c8f83c021b20e157797c9355c85eea778d5b9adf3df0e3fe729aa"} Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.048205 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b95cfcf9c-rdbk5"] Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.054594 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b95cfcf9c-rdbk5"] Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.369936 4693 scope.go:117] "RemoveContainer" containerID="de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.498538 4693 scope.go:117] "RemoveContainer" containerID="b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf" Nov 
25 12:27:51 crc kubenswrapper[4693]: E1125 12:27:51.511263 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf\": container with ID starting with b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf not found: ID does not exist" containerID="b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.511314 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf"} err="failed to get container status \"b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf\": rpc error: code = NotFound desc = could not find container \"b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf\": container with ID starting with b4b4cbcbee2fd010341bf6c9b5b6a8a8368c831a67c333fd3795993503274baf not found: ID does not exist" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.511353 4693 scope.go:117] "RemoveContainer" containerID="de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36" Nov 25 12:27:51 crc kubenswrapper[4693]: E1125 12:27:51.512446 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36\": container with ID starting with de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36 not found: ID does not exist" containerID="de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.512489 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36"} err="failed to get container status \"de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36\": rpc error: code = NotFound desc = could not find container \"de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36\": container with ID starting with de4cdf52578656f0d42ce09af3914cfc0ce6a04f6fb3cd05a7d190aee226ac36 not found: ID does not exist" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.547427 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.664127 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd001ffc-9a83-408f-bc46-9a7cacf052c7-etc-machine-id\") pod \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.664174 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nglx\" (UniqueName: \"kubernetes.io/projected/fd001ffc-9a83-408f-bc46-9a7cacf052c7-kube-api-access-4nglx\") pod \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.664210 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-scripts\") pod \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.664275 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-config-data\") pod \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.664339 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-combined-ca-bundle\") pod \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.664571 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-db-sync-config-data\") pod \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\" (UID: \"fd001ffc-9a83-408f-bc46-9a7cacf052c7\") " Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.664269 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd001ffc-9a83-408f-bc46-9a7cacf052c7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fd001ffc-9a83-408f-bc46-9a7cacf052c7" (UID: "fd001ffc-9a83-408f-bc46-9a7cacf052c7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.665192 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd001ffc-9a83-408f-bc46-9a7cacf052c7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.671313 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-scripts" (OuterVolumeSpecName: "scripts") pod "fd001ffc-9a83-408f-bc46-9a7cacf052c7" (UID: "fd001ffc-9a83-408f-bc46-9a7cacf052c7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.671397 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd001ffc-9a83-408f-bc46-9a7cacf052c7-kube-api-access-4nglx" (OuterVolumeSpecName: "kube-api-access-4nglx") pod "fd001ffc-9a83-408f-bc46-9a7cacf052c7" (UID: "fd001ffc-9a83-408f-bc46-9a7cacf052c7"). InnerVolumeSpecName "kube-api-access-4nglx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.672943 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fd001ffc-9a83-408f-bc46-9a7cacf052c7" (UID: "fd001ffc-9a83-408f-bc46-9a7cacf052c7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.695602 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd001ffc-9a83-408f-bc46-9a7cacf052c7" (UID: "fd001ffc-9a83-408f-bc46-9a7cacf052c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.727183 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-config-data" (OuterVolumeSpecName: "config-data") pod "fd001ffc-9a83-408f-bc46-9a7cacf052c7" (UID: "fd001ffc-9a83-408f-bc46-9a7cacf052c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.766415 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nglx\" (UniqueName: \"kubernetes.io/projected/fd001ffc-9a83-408f-bc46-9a7cacf052c7-kube-api-access-4nglx\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.766448 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.766457 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.766466 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.766474 4693 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd001ffc-9a83-408f-bc46-9a7cacf052c7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.805608 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" event={"ID":"d089afe1-1724-42ca-8204-355e98e903f2","Type":"ContainerStarted","Data":"d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133"} Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.806703 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.814149 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c497f557-r9sp7" event={"ID":"66ccc10f-a153-4582-ab8d-f687b0c6bb20","Type":"ContainerStarted","Data":"26061ae90bc49b4b70bb36af842f4adf3d63adb917290425ab89de4be39021d0"} Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.814290 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.829667 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-kqn9c" event={"ID":"fd001ffc-9a83-408f-bc46-9a7cacf052c7","Type":"ContainerDied","Data":"54ca02c8bc011bc57eaeecf8ede7c61737d9421bfa9ad17258191c951084fc91"} Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.829705 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54ca02c8bc011bc57eaeecf8ede7c61737d9421bfa9ad17258191c951084fc91" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.829709 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-kqn9c" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.845288 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" podStartSLOduration=3.845269903 podStartE2EDuration="3.845269903s" podCreationTimestamp="2025-11-25 12:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:51.842718081 +0000 UTC m=+1191.760803472" watchObservedRunningTime="2025-11-25 12:27:51.845269903 +0000 UTC m=+1191.763355284" Nov 25 12:27:51 crc kubenswrapper[4693]: I1125 12:27:51.904636 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57c497f557-r9sp7" podStartSLOduration=2.904614155 podStartE2EDuration="2.904614155s" podCreationTimestamp="2025-11-25 12:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:51.889578669 +0000 UTC m=+1191.807664060" watchObservedRunningTime="2025-11-25 12:27:51.904614155 +0000 UTC m=+1191.822699536" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.025786 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:27:52 crc kubenswrapper[4693]: E1125 12:27:52.026125 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd001ffc-9a83-408f-bc46-9a7cacf052c7" containerName="cinder-db-sync" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.026140 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd001ffc-9a83-408f-bc46-9a7cacf052c7" containerName="cinder-db-sync" Nov 25 12:27:52 crc kubenswrapper[4693]: E1125 12:27:52.026159 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerName="init" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.026165 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerName="init" Nov 25 12:27:52 crc kubenswrapper[4693]: E1125 12:27:52.026177 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerName="dnsmasq-dns" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.026183 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerName="dnsmasq-dns" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.026395 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd001ffc-9a83-408f-bc46-9a7cacf052c7" containerName="cinder-db-sync" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.026418 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bd3075-1211-402c-9e19-a8057ee182ea" containerName="dnsmasq-dns" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.027307 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.031443 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6pxfk" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.031592 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.032240 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.034527 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.046996 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.072362 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.072443 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.072484 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.072502 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.072556 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.072593 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllqm\" (UniqueName: \"kubernetes.io/projected/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-kube-api-access-fllqm\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.121384 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b66f7449-hxwtk"] Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.168329 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7965876c4f-pqjb5"] Nov 25 12:27:52 crc 
kubenswrapper[4693]: I1125 12:27:52.169727 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.182327 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.182656 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.182699 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.182720 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.182765 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.182811 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fllqm\" (UniqueName: \"kubernetes.io/projected/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-kube-api-access-fllqm\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.183124 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.191212 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.191693 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-scripts\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.192123 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.193436 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.206218 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7965876c4f-pqjb5"] Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.219210 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fllqm\" (UniqueName: \"kubernetes.io/projected/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-kube-api-access-fllqm\") pod \"cinder-scheduler-0\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.284575 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkvfh\" (UniqueName: \"kubernetes.io/projected/4e5da329-8105-4bd9-a340-65d273d51bbf-kube-api-access-mkvfh\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.284649 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-nb\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.284685 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.284715 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-swift-storage-0\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.284762 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-svc\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.284795 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-config\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: 
\"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.358338 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.384833 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.386451 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkvfh\" (UniqueName: \"kubernetes.io/projected/4e5da329-8105-4bd9-a340-65d273d51bbf-kube-api-access-mkvfh\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.386507 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-nb\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.386539 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.386564 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-swift-storage-0\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.386603 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-svc\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.386650 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-config\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.387486 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-config\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.388232 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-nb\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.389228 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-svc\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.389296 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.389953 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-swift-storage-0\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.390081 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.393340 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.400893 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.405907 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkvfh\" (UniqueName: \"kubernetes.io/projected/4e5da329-8105-4bd9-a340-65d273d51bbf-kube-api-access-mkvfh\") pod \"dnsmasq-dns-7965876c4f-pqjb5\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.490082 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-logs\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.490158 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.490318 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.490351 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.490428 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86d86\" (UniqueName: \"kubernetes.io/projected/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-kube-api-access-86d86\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.490459 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-scripts\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.490497 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.592782 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86d86\" (UniqueName: \"kubernetes.io/projected/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-kube-api-access-86d86\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.592879 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-scripts\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.592964 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.593037 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-logs\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.593128 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.593209 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.593236 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: 
I1125 12:27:52.601595 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.601901 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-logs\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.602401 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.605829 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data-custom\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.607886 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.609809 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-scripts\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.612513 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86d86\" (UniqueName: \"kubernetes.io/projected/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-kube-api-access-86d86\") pod \"cinder-api-0\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.612805 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.751450 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 12:27:52 crc kubenswrapper[4693]: I1125 12:27:52.825993 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bd3075-1211-402c-9e19-a8057ee182ea" path="/var/lib/kubelet/pods/d3bd3075-1211-402c-9e19-a8057ee182ea/volumes" Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.464561 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7965876c4f-pqjb5"] Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.547442 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.570309 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:27:53 crc kubenswrapper[4693]: W1125 12:27:53.578362 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd00fccc_6a75_4f2a_a950_4e6b6ea3fa2a.slice/crio-1b5b43571192b932f03219da31097743a9c799a6098a545a56e8f6c192c5fb15 WatchSource:0}: Error finding container 1b5b43571192b932f03219da31097743a9c799a6098a545a56e8f6c192c5fb15: Status 404 returned error can't find the container with id 1b5b43571192b932f03219da31097743a9c799a6098a545a56e8f6c192c5fb15 Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.648588 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-574fd6fdfd-bz6sm" podUID="14ff5a36-1912-43a8-b87f-57a6858a5799" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.875522 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" event={"ID":"c743d467-4bdb-41ce-bf74-5051a93fc3d6","Type":"ContainerStarted","Data":"27b4613f3b7fd9473ede6aed270fa6f8e450b81d90578df952b76cdba9ba15a6"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.875564 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" event={"ID":"c743d467-4bdb-41ce-bf74-5051a93fc3d6","Type":"ContainerStarted","Data":"169727d8c07f25345dd8f69a10ab9d1afd7760dc4e385b0fa217efcc3937dc95"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.880526 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerStarted","Data":"9324239db46fe8a9ef39280b7f2d83c67bd77feeaff39d2316a74fe36197055e"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.880710 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.892385 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" event={"ID":"4e5da329-8105-4bd9-a340-65d273d51bbf","Type":"ContainerStarted","Data":"f06b4b7b5a2207d0b95f80c9eb232716289bd7501129a1ad16981ee9d8b4cf74"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.892463 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" event={"ID":"4e5da329-8105-4bd9-a340-65d273d51bbf","Type":"ContainerStarted","Data":"2f4c1995c4b40c60316dcf1a281667fd7060a2b5d387247fa09721791e0315c4"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.903423 4693 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a","Type":"ContainerStarted","Data":"1b5b43571192b932f03219da31097743a9c799a6098a545a56e8f6c192c5fb15"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.913620 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.914515 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.942723 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d9fc5569-wdqkp" event={"ID":"084144ce-d043-4dd8-bc4b-e904c42e47cd","Type":"ContainerStarted","Data":"88d0c1f8ab9f3f03978e7932bf491c0b57db79aa24fdd81294faba0bb1a8c1e7"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.942775 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7d9fc5569-wdqkp" event={"ID":"084144ce-d043-4dd8-bc4b-e904c42e47cd","Type":"ContainerStarted","Data":"589e3e136bc7f0c082d6de510e3c86d58c5148ed52f491c4ad1ecb511b542d05"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.944973 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d949d856b-fbdcg" podStartSLOduration=2.832700334 podStartE2EDuration="5.944959208s" podCreationTimestamp="2025-11-25 12:27:48 +0000 UTC" firstStartedPulling="2025-11-25 12:27:49.826655457 +0000 UTC m=+1189.744740838" lastFinishedPulling="2025-11-25 12:27:52.938914331 +0000 UTC m=+1192.856999712" observedRunningTime="2025-11-25 12:27:53.904796859 +0000 UTC m=+1193.822882240" watchObservedRunningTime="2025-11-25 12:27:53.944959208 +0000 UTC m=+1193.863044599" Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.982236 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" podUID="d089afe1-1724-42ca-8204-355e98e903f2" containerName="dnsmasq-dns" containerID="cri-o://d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133" gracePeriod=10 Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.982438 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ed0f5e8-1099-41d9-b699-545f3dd11bdb","Type":"ContainerStarted","Data":"891e4c98517eb7b31df9d8f2c682b583994fec2606330ceb3c18649addda1c4f"} Nov 25 12:27:53 crc kubenswrapper[4693]: I1125 12:27:53.985279 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.781821522 podStartE2EDuration="7.985255629s" podCreationTimestamp="2025-11-25 12:27:46 +0000 UTC" firstStartedPulling="2025-11-25 12:27:47.739788856 +0000 UTC m=+1187.657874237" lastFinishedPulling="2025-11-25 12:27:52.943222963 +0000 UTC m=+1192.861308344" observedRunningTime="2025-11-25 12:27:53.982876722 +0000 UTC m=+1193.900962103" watchObservedRunningTime="2025-11-25 12:27:53.985255629 +0000 UTC m=+1193.903341010" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.007533 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7d9fc5569-wdqkp" podStartSLOduration=2.770651305 podStartE2EDuration="6.00750949s" podCreationTimestamp="2025-11-25 12:27:48 +0000 UTC" firstStartedPulling="2025-11-25 12:27:49.708762606 +0000 UTC m=+1189.626847987" lastFinishedPulling="2025-11-25 12:27:52.945620791 
+0000 UTC m=+1192.863706172" observedRunningTime="2025-11-25 12:27:54.004570936 +0000 UTC m=+1193.922656317" watchObservedRunningTime="2025-11-25 12:27:54.00750949 +0000 UTC m=+1193.925594871" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.022617 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.023007 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.023030 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.040286 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.094312 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.133223 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.496017 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.660799 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q9sj\" (UniqueName: \"kubernetes.io/projected/d089afe1-1724-42ca-8204-355e98e903f2-kube-api-access-2q9sj\") pod \"d089afe1-1724-42ca-8204-355e98e903f2\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.660885 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-svc\") pod \"d089afe1-1724-42ca-8204-355e98e903f2\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.660935 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-swift-storage-0\") pod \"d089afe1-1724-42ca-8204-355e98e903f2\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.660956 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-nb\") pod \"d089afe1-1724-42ca-8204-355e98e903f2\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.661015 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-sb\") pod \"d089afe1-1724-42ca-8204-355e98e903f2\" (UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.661073 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-config\") pod \"d089afe1-1724-42ca-8204-355e98e903f2\" 
(UID: \"d089afe1-1724-42ca-8204-355e98e903f2\") " Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.668108 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d089afe1-1724-42ca-8204-355e98e903f2-kube-api-access-2q9sj" (OuterVolumeSpecName: "kube-api-access-2q9sj") pod "d089afe1-1724-42ca-8204-355e98e903f2" (UID: "d089afe1-1724-42ca-8204-355e98e903f2"). InnerVolumeSpecName "kube-api-access-2q9sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.721280 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d089afe1-1724-42ca-8204-355e98e903f2" (UID: "d089afe1-1724-42ca-8204-355e98e903f2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.727154 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d089afe1-1724-42ca-8204-355e98e903f2" (UID: "d089afe1-1724-42ca-8204-355e98e903f2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.738039 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d089afe1-1724-42ca-8204-355e98e903f2" (UID: "d089afe1-1724-42ca-8204-355e98e903f2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.742542 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-config" (OuterVolumeSpecName: "config") pod "d089afe1-1724-42ca-8204-355e98e903f2" (UID: "d089afe1-1724-42ca-8204-355e98e903f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.743039 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d089afe1-1724-42ca-8204-355e98e903f2" (UID: "d089afe1-1724-42ca-8204-355e98e903f2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.762812 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q9sj\" (UniqueName: \"kubernetes.io/projected/d089afe1-1724-42ca-8204-355e98e903f2-kube-api-access-2q9sj\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.762849 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.762859 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.762878 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.762889 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.762901 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d089afe1-1724-42ca-8204-355e98e903f2-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:54 crc kubenswrapper[4693]: I1125 12:27:54.975184 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.039529 4693 generic.go:334] "Generic (PLEG): container finished" podID="4e5da329-8105-4bd9-a340-65d273d51bbf" containerID="f06b4b7b5a2207d0b95f80c9eb232716289bd7501129a1ad16981ee9d8b4cf74" exitCode=0 Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.039679 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" event={"ID":"4e5da329-8105-4bd9-a340-65d273d51bbf","Type":"ContainerDied","Data":"f06b4b7b5a2207d0b95f80c9eb232716289bd7501129a1ad16981ee9d8b4cf74"} Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.039710 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" event={"ID":"4e5da329-8105-4bd9-a340-65d273d51bbf","Type":"ContainerStarted","Data":"2b59293260077cf5251994d285be38340a2349fa8b1d78ef6227a85fd010b5df"} Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.047187 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.053151 4693 generic.go:334] "Generic (PLEG): container finished" podID="d089afe1-1724-42ca-8204-355e98e903f2" containerID="d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133" exitCode=0 Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.053215 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" event={"ID":"d089afe1-1724-42ca-8204-355e98e903f2","Type":"ContainerDied","Data":"d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133"} Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.053241 4693 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" event={"ID":"d089afe1-1724-42ca-8204-355e98e903f2","Type":"ContainerDied","Data":"71205812107c22ee15bca0e8208c1372100375756f42b2455fc8f731eda7571e"} Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.053255 4693 scope.go:117] "RemoveContainer" containerID="d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.053396 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b66f7449-hxwtk" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.061305 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ed0f5e8-1099-41d9-b699-545f3dd11bdb","Type":"ContainerStarted","Data":"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43"} Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.062096 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.062135 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.062144 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.062154 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.085683 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" podStartSLOduration=3.085660989 podStartE2EDuration="3.085660989s" podCreationTimestamp="2025-11-25 12:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:55.076721415 +0000 UTC m=+1194.994806796" watchObservedRunningTime="2025-11-25 12:27:55.085660989 +0000 UTC m=+1195.003746370" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.099101 4693 scope.go:117] "RemoveContainer" containerID="b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.101144 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b66f7449-hxwtk"] Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.121455 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66b66f7449-hxwtk"] Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.208039 4693 scope.go:117] "RemoveContainer" containerID="d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133" Nov 25 12:27:55 crc kubenswrapper[4693]: E1125 12:27:55.208531 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133\": container with ID starting with d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133 not found: ID does not exist" containerID="d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.208570 4693 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133"} err="failed to get container status \"d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133\": rpc error: code = NotFound desc = could not find container \"d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133\": container with ID starting with d258898b864bcb13d08902b6bdbeec1e11a2643e41b72a940dcdfe7e93fc6133 not found: ID does not exist" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.208598 4693 scope.go:117] "RemoveContainer" containerID="b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97" Nov 25 12:27:55 crc kubenswrapper[4693]: E1125 12:27:55.208917 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97\": container with ID starting with b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97 not found: ID does not exist" containerID="b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97" Nov 25 12:27:55 crc kubenswrapper[4693]: I1125 12:27:55.209259 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97"} err="failed to get container status \"b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97\": rpc error: code = NotFound desc = could not find container \"b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97\": container with ID starting with b350663654874812aa92c26c0cc289cb50376da4f4034c458ab5d5bb68228b97 not found: ID does not exist" Nov 25 12:27:56 crc kubenswrapper[4693]: I1125 12:27:56.074346 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a","Type":"ContainerStarted","Data":"4dbcf025859eb1880605762156ae96e1c45ab3b7310d4464478b81cd2072a4e2"} Nov 25 12:27:56 crc kubenswrapper[4693]: I1125 12:27:56.077128 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ed0f5e8-1099-41d9-b699-545f3dd11bdb","Type":"ContainerStarted","Data":"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c"} Nov 25 12:27:56 crc kubenswrapper[4693]: I1125 12:27:56.077537 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 25 12:27:56 crc kubenswrapper[4693]: I1125 12:27:56.105774 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.105750563 podStartE2EDuration="4.105750563s" podCreationTimestamp="2025-11-25 12:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:27:56.09999312 +0000 UTC m=+1196.018078501" watchObservedRunningTime="2025-11-25 12:27:56.105750563 +0000 UTC m=+1196.023835944" Nov 25 12:27:56 crc kubenswrapper[4693]: I1125 12:27:56.825956 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d089afe1-1724-42ca-8204-355e98e903f2" path="/var/lib/kubelet/pods/d089afe1-1724-42ca-8204-355e98e903f2/volumes" Nov 25 12:27:56 crc kubenswrapper[4693]: I1125 12:27:56.866441 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.087682 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a","Type":"ContainerStarted","Data":"a2a9b8694e0ee2319282287d84825b4cde5ac4bd358c44f61561e1d7eda1096b"} Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.087777 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.087790 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.087812 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.087846 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.106952 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.221787229 podStartE2EDuration="6.106934691s" podCreationTimestamp="2025-11-25 12:27:51 +0000 UTC" firstStartedPulling="2025-11-25 12:27:53.596914985 +0000 UTC m=+1193.515000366" lastFinishedPulling="2025-11-25 12:27:54.482062437 +0000 UTC m=+1194.400147828" observedRunningTime="2025-11-25 12:27:57.105160051 +0000 UTC m=+1197.023245452" watchObservedRunningTime="2025-11-25 12:27:57.106934691 +0000 UTC m=+1197.025020072" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.359108 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.500049 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c49f9f854-clbv4"] Nov 25 12:27:57 crc kubenswrapper[4693]: E1125 12:27:57.500439 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d089afe1-1724-42ca-8204-355e98e903f2" containerName="init" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.500450 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d089afe1-1724-42ca-8204-355e98e903f2" containerName="init" Nov 25 12:27:57 crc kubenswrapper[4693]: E1125 12:27:57.500471 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d089afe1-1724-42ca-8204-355e98e903f2" containerName="dnsmasq-dns" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.500480 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d089afe1-1724-42ca-8204-355e98e903f2" containerName="dnsmasq-dns" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.500662 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d089afe1-1724-42ca-8204-355e98e903f2" containerName="dnsmasq-dns" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.503926 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.509977 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c49f9f854-clbv4"] Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.515807 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.516010 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.622028 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-config-data-custom\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.622073 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wb2\" (UniqueName: \"kubernetes.io/projected/0c770547-1b83-43a7-ac47-82226ce02958-kube-api-access-26wb2\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.622105 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-public-tls-certs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.622168 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c770547-1b83-43a7-ac47-82226ce02958-logs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.622204 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-config-data\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.622238 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-combined-ca-bundle\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.622257 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-internal-tls-certs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723336 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c770547-1b83-43a7-ac47-82226ce02958-logs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723520 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-config-data\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723567 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-combined-ca-bundle\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723585 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-internal-tls-certs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723666 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-config-data-custom\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723686 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wb2\" (UniqueName: \"kubernetes.io/projected/0c770547-1b83-43a7-ac47-82226ce02958-kube-api-access-26wb2\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723714 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-public-tls-certs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.723912 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c770547-1b83-43a7-ac47-82226ce02958-logs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.731141 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-config-data-custom\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.731948 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-public-tls-certs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.739276 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-combined-ca-bundle\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.739283 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-internal-tls-certs\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.743167 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wb2\" (UniqueName: \"kubernetes.io/projected/0c770547-1b83-43a7-ac47-82226ce02958-kube-api-access-26wb2\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.764573 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c770547-1b83-43a7-ac47-82226ce02958-config-data\") pod \"barbican-api-5c49f9f854-clbv4\" (UID: \"0c770547-1b83-43a7-ac47-82226ce02958\") " pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:57 crc kubenswrapper[4693]: I1125 12:27:57.829535 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.102747 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api-log" containerID="cri-o://e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43" gracePeriod=30 Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.103261 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api" containerID="cri-o://a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c" gracePeriod=30 Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.105082 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.164466 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.164901 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.358265 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c49f9f854-clbv4"] Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.407932 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.408328 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 12:27:58 crc kubenswrapper[4693]: I1125 12:27:58.679463 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.036587 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.122678 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c49f9f854-clbv4" event={"ID":"0c770547-1b83-43a7-ac47-82226ce02958","Type":"ContainerStarted","Data":"f9d386ddd99a19a177baf6d47fe63321eec3176a1cbeaef45334777f953b28e9"} Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.122727 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c49f9f854-clbv4" event={"ID":"0c770547-1b83-43a7-ac47-82226ce02958","Type":"ContainerStarted","Data":"4c8300a598b4ce3287d877a81fb9ba5a3eaea3ef7264554c4e0b90757e65fde7"} Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.137192 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerID="a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c" exitCode=0 Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.137226 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerID="e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43" exitCode=143 Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.137472 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ed0f5e8-1099-41d9-b699-545f3dd11bdb","Type":"ContainerDied","Data":"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c"} Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.137519 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ed0f5e8-1099-41d9-b699-545f3dd11bdb","Type":"ContainerDied","Data":"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43"} Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.137532 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7ed0f5e8-1099-41d9-b699-545f3dd11bdb","Type":"ContainerDied","Data":"891e4c98517eb7b31df9d8f2c682b583994fec2606330ceb3c18649addda1c4f"} Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.137547 4693 scope.go:117] "RemoveContainer" containerID="a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.137688 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.172968 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-scripts\") pod \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.173288 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-combined-ca-bundle\") pod \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.174128 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data\") pod \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.174248 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-etc-machine-id\") pod \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.174615 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data-custom\") pod \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.174738 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86d86\" (UniqueName: \"kubernetes.io/projected/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-kube-api-access-86d86\") pod \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.174851 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-logs\") pod \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\" (UID: \"7ed0f5e8-1099-41d9-b699-545f3dd11bdb\") " Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.175743 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-logs" (OuterVolumeSpecName: "logs") pod "7ed0f5e8-1099-41d9-b699-545f3dd11bdb" (UID: "7ed0f5e8-1099-41d9-b699-545f3dd11bdb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.177092 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7ed0f5e8-1099-41d9-b699-545f3dd11bdb" (UID: "7ed0f5e8-1099-41d9-b699-545f3dd11bdb"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.177655 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.177680 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.187693 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-kube-api-access-86d86" (OuterVolumeSpecName: "kube-api-access-86d86") pod "7ed0f5e8-1099-41d9-b699-545f3dd11bdb" (UID: "7ed0f5e8-1099-41d9-b699-545f3dd11bdb"). InnerVolumeSpecName "kube-api-access-86d86". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.189668 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ed0f5e8-1099-41d9-b699-545f3dd11bdb" (UID: "7ed0f5e8-1099-41d9-b699-545f3dd11bdb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.189746 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-scripts" (OuterVolumeSpecName: "scripts") pod "7ed0f5e8-1099-41d9-b699-545f3dd11bdb" (UID: "7ed0f5e8-1099-41d9-b699-545f3dd11bdb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.196989 4693 scope.go:117] "RemoveContainer" containerID="e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.218045 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ed0f5e8-1099-41d9-b699-545f3dd11bdb" (UID: "7ed0f5e8-1099-41d9-b699-545f3dd11bdb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.234314 4693 scope.go:117] "RemoveContainer" containerID="a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c" Nov 25 12:27:59 crc kubenswrapper[4693]: E1125 12:27:59.235267 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c\": container with ID starting with a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c not found: ID does not exist" containerID="a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.235331 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c"} err="failed to get container status \"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c\": rpc error: code = NotFound desc = could not find container \"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c\": container with ID starting with a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c not found: ID does not exist" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.235365 4693 scope.go:117] "RemoveContainer" containerID="e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43" Nov 25 12:27:59 crc kubenswrapper[4693]: E1125 12:27:59.235684 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43\": container with ID starting with e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43 not found: ID does not exist" containerID="e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.235724 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43"} err="failed to get container status \"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43\": rpc error: code = NotFound desc = could not find container \"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43\": container with ID starting with e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43 not found: ID does not exist" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.235744 4693 scope.go:117] "RemoveContainer" containerID="a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.235929 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c"} err="failed to get container status \"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c\": rpc error: code = NotFound desc = could not find container \"a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c\": container with ID starting with a91f4e19d68c2fbe4d7e38f7b8bb96813807cd783dc398d150b5eca46413086c not found: ID does not exist" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.235947 4693 scope.go:117] "RemoveContainer" containerID="e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.236200 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43"} err="failed to get container status \"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43\": rpc error: code = NotFound desc = could not find container \"e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43\": container with ID starting with e0258f01435c6a89ad892f66cc389700d18b07f37b55551eed6e2fdaa827ed43 not found: ID does not exist" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.241339 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data" (OuterVolumeSpecName: "config-data") pod "7ed0f5e8-1099-41d9-b699-545f3dd11bdb" (UID: "7ed0f5e8-1099-41d9-b699-545f3dd11bdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.279471 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.279766 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86d86\" (UniqueName: \"kubernetes.io/projected/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-kube-api-access-86d86\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.279785 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.279798 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.279810 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ed0f5e8-1099-41d9-b699-545f3dd11bdb-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.480081 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.494735 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.526520 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:59 crc kubenswrapper[4693]: E1125 12:27:59.526960 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.526985 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api" Nov 25 12:27:59 crc kubenswrapper[4693]: E1125 12:27:59.527007 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api-log" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.527015 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api-log" Nov 25 12:27:59 crc kubenswrapper[4693]: 
I1125 12:27:59.527259 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.527288 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" containerName="cinder-api-log" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.528728 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.530928 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.531528 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.532237 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.562792 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689171 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxgp8\" (UniqueName: \"kubernetes.io/projected/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-kube-api-access-xxgp8\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689275 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-scripts\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689359 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689577 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-logs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689658 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689711 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689741 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-config-data-custom\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689826 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-config-data\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.689901 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.792998 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxgp8\" (UniqueName: \"kubernetes.io/projected/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-kube-api-access-xxgp8\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.793260 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-scripts\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.794737 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.794922 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-logs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.794989 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.795045 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.795085 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-config-data-custom\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc 
kubenswrapper[4693]: I1125 12:27:59.795194 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-config-data\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.795273 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.795498 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.799017 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-logs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.803632 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-config-data\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.804297 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.804524 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.805278 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-scripts\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.808235 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.811193 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-config-data-custom\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.824851 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xxgp8\" (UniqueName: \"kubernetes.io/projected/8aeba5db-6f5b-4714-9d46-5db9b9058cb6-kube-api-access-xxgp8\") pod \"cinder-api-0\" (UID: \"8aeba5db-6f5b-4714-9d46-5db9b9058cb6\") " pod="openstack/cinder-api-0" Nov 25 12:27:59 crc kubenswrapper[4693]: I1125 12:27:59.843209 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 25 12:28:00 crc kubenswrapper[4693]: I1125 12:28:00.155990 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c49f9f854-clbv4" event={"ID":"0c770547-1b83-43a7-ac47-82226ce02958","Type":"ContainerStarted","Data":"bbf3b179f3972cf292b287dc3bacd9a0474cb1b4ef6f5a920bd586d5d54db95e"} Nov 25 12:28:00 crc kubenswrapper[4693]: I1125 12:28:00.156333 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:28:00 crc kubenswrapper[4693]: I1125 12:28:00.156350 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:28:00 crc kubenswrapper[4693]: I1125 12:28:00.347738 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c49f9f854-clbv4" podStartSLOduration=3.347721658 podStartE2EDuration="3.347721658s" podCreationTimestamp="2025-11-25 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:00.183086022 +0000 UTC m=+1200.101171403" watchObservedRunningTime="2025-11-25 12:28:00.347721658 +0000 UTC m=+1200.265807039" Nov 25 12:28:00 crc kubenswrapper[4693]: I1125 12:28:00.350603 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 25 12:28:00 crc kubenswrapper[4693]: I1125 12:28:00.850284 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed0f5e8-1099-41d9-b699-545f3dd11bdb" path="/var/lib/kubelet/pods/7ed0f5e8-1099-41d9-b699-545f3dd11bdb/volumes" Nov 25 12:28:01 crc kubenswrapper[4693]: I1125 12:28:01.176134 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8aeba5db-6f5b-4714-9d46-5db9b9058cb6","Type":"ContainerStarted","Data":"1934ab8b28ef2f2c75719629b27db1130f6feb2b82eeee0458a867d36fa1a10a"} Nov 25 12:28:01 crc kubenswrapper[4693]: I1125 12:28:01.176180 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8aeba5db-6f5b-4714-9d46-5db9b9058cb6","Type":"ContainerStarted","Data":"40050d73ca521c55a4185d7a296249c2d47da81f0e475ad22ed3e83f363589cf"} Nov 25 12:28:01 crc kubenswrapper[4693]: I1125 12:28:01.390938 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:28:01 crc kubenswrapper[4693]: I1125 12:28:01.548171 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:28:02 crc kubenswrapper[4693]: I1125 12:28:02.186890 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8aeba5db-6f5b-4714-9d46-5db9b9058cb6","Type":"ContainerStarted","Data":"8a171555475982cba85a92a0388aa489f52dc9f1cab40012f6492b385615b341"} Nov 25 12:28:02 crc kubenswrapper[4693]: I1125 12:28:02.213154 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.213132593 
podStartE2EDuration="3.213132593s" podCreationTimestamp="2025-11-25 12:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:02.2094754 +0000 UTC m=+1202.127560781" watchObservedRunningTime="2025-11-25 12:28:02.213132593 +0000 UTC m=+1202.131217974" Nov 25 12:28:02 crc kubenswrapper[4693]: I1125 12:28:02.575735 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 25 12:28:02 crc kubenswrapper[4693]: I1125 12:28:02.614536 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:28:02 crc kubenswrapper[4693]: I1125 12:28:02.617364 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:28:02 crc kubenswrapper[4693]: I1125 12:28:02.683657 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c8d5b9fc-mp5vw"] Nov 25 12:28:02 crc kubenswrapper[4693]: I1125 12:28:02.683882 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" podUID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerName="dnsmasq-dns" containerID="cri-o://bc7e5d881cdd7f561933c7688600977769a0fecd40b302575bb4ea5fd25ae2d9" gracePeriod=10 Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.204926 4693 generic.go:334] "Generic (PLEG): container finished" podID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerID="bc7e5d881cdd7f561933c7688600977769a0fecd40b302575bb4ea5fd25ae2d9" exitCode=0 Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.205805 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" event={"ID":"4d599a55-d080-44b9-b2e3-1f94e1724ad6","Type":"ContainerDied","Data":"bc7e5d881cdd7f561933c7688600977769a0fecd40b302575bb4ea5fd25ae2d9"} Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.205838 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" event={"ID":"4d599a55-d080-44b9-b2e3-1f94e1724ad6","Type":"ContainerDied","Data":"454a9d3ca8dd0c1496dd35195579f55681943c8a9aaaf82d9a5d15ab92acfe7b"} Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.205850 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="454a9d3ca8dd0c1496dd35195579f55681943c8a9aaaf82d9a5d15ab92acfe7b" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.205994 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="cinder-scheduler" containerID="cri-o://4dbcf025859eb1880605762156ae96e1c45ab3b7310d4464478b81cd2072a4e2" gracePeriod=30 Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.206109 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="probe" containerID="cri-o://a2a9b8694e0ee2319282287d84825b4cde5ac4bd358c44f61561e1d7eda1096b" gracePeriod=30 Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.206343 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.251963 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.289129 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-config\") pod \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.289260 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2mpj\" (UniqueName: \"kubernetes.io/projected/4d599a55-d080-44b9-b2e3-1f94e1724ad6-kube-api-access-w2mpj\") pod \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.289338 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-nb\") pod \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.289443 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-sb\") pod \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.289474 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-svc\") pod \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.289503 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-swift-storage-0\") pod \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\" (UID: \"4d599a55-d080-44b9-b2e3-1f94e1724ad6\") " Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.298039 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d599a55-d080-44b9-b2e3-1f94e1724ad6-kube-api-access-w2mpj" (OuterVolumeSpecName: "kube-api-access-w2mpj") pod "4d599a55-d080-44b9-b2e3-1f94e1724ad6" (UID: "4d599a55-d080-44b9-b2e3-1f94e1724ad6"). InnerVolumeSpecName "kube-api-access-w2mpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.358136 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4d599a55-d080-44b9-b2e3-1f94e1724ad6" (UID: "4d599a55-d080-44b9-b2e3-1f94e1724ad6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.374815 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-config" (OuterVolumeSpecName: "config") pod "4d599a55-d080-44b9-b2e3-1f94e1724ad6" (UID: "4d599a55-d080-44b9-b2e3-1f94e1724ad6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.392336 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.392401 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.392414 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2mpj\" (UniqueName: \"kubernetes.io/projected/4d599a55-d080-44b9-b2e3-1f94e1724ad6-kube-api-access-w2mpj\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.396773 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d599a55-d080-44b9-b2e3-1f94e1724ad6" (UID: "4d599a55-d080-44b9-b2e3-1f94e1724ad6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.416735 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d599a55-d080-44b9-b2e3-1f94e1724ad6" (UID: "4d599a55-d080-44b9-b2e3-1f94e1724ad6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.425304 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d599a55-d080-44b9-b2e3-1f94e1724ad6" (UID: "4d599a55-d080-44b9-b2e3-1f94e1724ad6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.493855 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.493894 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.493904 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d599a55-d080-44b9-b2e3-1f94e1724ad6-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:03 crc kubenswrapper[4693]: I1125 12:28:03.856219 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7888468d67-2bztz" Nov 25 12:28:04 crc kubenswrapper[4693]: I1125 12:28:04.216848 4693 generic.go:334] "Generic (PLEG): container finished" podID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerID="a2a9b8694e0ee2319282287d84825b4cde5ac4bd358c44f61561e1d7eda1096b" exitCode=0 Nov 25 12:28:04 crc kubenswrapper[4693]: I1125 12:28:04.216919 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a","Type":"ContainerDied","Data":"a2a9b8694e0ee2319282287d84825b4cde5ac4bd358c44f61561e1d7eda1096b"} Nov 25 12:28:04 crc kubenswrapper[4693]: I1125 12:28:04.216940 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c8d5b9fc-mp5vw" Nov 25 12:28:04 crc kubenswrapper[4693]: I1125 12:28:04.250034 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c8d5b9fc-mp5vw"] Nov 25 12:28:04 crc kubenswrapper[4693]: I1125 12:28:04.261509 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76c8d5b9fc-mp5vw"] Nov 25 12:28:04 crc kubenswrapper[4693]: I1125 12:28:04.824730 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" path="/var/lib/kubelet/pods/4d599a55-d080-44b9-b2e3-1f94e1724ad6/volumes" Nov 25 12:28:05 crc kubenswrapper[4693]: I1125 12:28:05.114157 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:28:05 crc kubenswrapper[4693]: I1125 12:28:05.114231 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:28:05 crc kubenswrapper[4693]: I1125 12:28:05.565847 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:28:05 crc kubenswrapper[4693]: I1125 12:28:05.566597 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7bf98548b6-68m92" Nov 25 12:28:05 crc kubenswrapper[4693]: I1125 12:28:05.990728 4693 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.959256 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5df97c965f-mdrk8"] Nov 25 12:28:06 crc kubenswrapper[4693]: E1125 12:28:06.959623 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerName="init" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.959635 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerName="init" Nov 25 12:28:06 crc kubenswrapper[4693]: E1125 12:28:06.959653 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerName="dnsmasq-dns" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.959659 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerName="dnsmasq-dns" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.959847 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d599a55-d080-44b9-b2e3-1f94e1724ad6" containerName="dnsmasq-dns" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.960746 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.964609 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.964779 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 25 12:28:06 crc kubenswrapper[4693]: I1125 12:28:06.964920 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.007301 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5df97c965f-mdrk8"] Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.068552 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf5pn\" (UniqueName: \"kubernetes.io/projected/3bce19a4-5298-4024-b291-19e2d6138081-kube-api-access-wf5pn\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.068637 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-config-data\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.068760 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bce19a4-5298-4024-b291-19e2d6138081-log-httpd\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.068811 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-internal-tls-certs\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.068945 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bce19a4-5298-4024-b291-19e2d6138081-run-httpd\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.068967 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-public-tls-certs\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.069113 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bce19a4-5298-4024-b291-19e2d6138081-etc-swift\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.069137 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-combined-ca-bundle\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170359 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bce19a4-5298-4024-b291-19e2d6138081-etc-swift\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170422 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-combined-ca-bundle\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170473 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf5pn\" (UniqueName: \"kubernetes.io/projected/3bce19a4-5298-4024-b291-19e2d6138081-kube-api-access-wf5pn\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170518 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-config-data\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170558 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/3bce19a4-5298-4024-b291-19e2d6138081-log-httpd\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170572 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-internal-tls-certs\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170617 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bce19a4-5298-4024-b291-19e2d6138081-run-httpd\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.170646 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-public-tls-certs\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.171160 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bce19a4-5298-4024-b291-19e2d6138081-log-httpd\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.171180 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bce19a4-5298-4024-b291-19e2d6138081-run-httpd\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.176417 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-combined-ca-bundle\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.176495 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bce19a4-5298-4024-b291-19e2d6138081-etc-swift\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.177067 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-internal-tls-certs\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.177713 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-public-tls-certs\") pod 
\"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.178801 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bce19a4-5298-4024-b291-19e2d6138081-config-data\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.192917 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf5pn\" (UniqueName: \"kubernetes.io/projected/3bce19a4-5298-4024-b291-19e2d6138081-kube-api-access-wf5pn\") pod \"swift-proxy-5df97c965f-mdrk8\" (UID: \"3bce19a4-5298-4024-b291-19e2d6138081\") " pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.279172 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:07 crc kubenswrapper[4693]: I1125 12:28:07.908116 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5df97c965f-mdrk8"] Nov 25 12:28:07 crc kubenswrapper[4693]: W1125 12:28:07.939251 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bce19a4_5298_4024_b291_19e2d6138081.slice/crio-670a6d60a21fc78c0b2d1c539e1c467f824d5ec308358fcf416b59e4fbf19936 WatchSource:0}: Error finding container 670a6d60a21fc78c0b2d1c539e1c467f824d5ec308358fcf416b59e4fbf19936: Status 404 returned error can't find the container with id 670a6d60a21fc78c0b2d1c539e1c467f824d5ec308358fcf416b59e4fbf19936 Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.053986 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-574fd6fdfd-bz6sm" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.123550 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bbcbd4584-78jln"] Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.124241 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bbcbd4584-78jln" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon-log" containerID="cri-o://5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38" gracePeriod=30 Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.125004 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bbcbd4584-78jln" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" containerID="cri-o://d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f" gracePeriod=30 Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.235809 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.237288 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.243022 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rzw75" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.244480 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.245592 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.268459 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.298632 4693 generic.go:334] "Generic (PLEG): container finished" podID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerID="4dbcf025859eb1880605762156ae96e1c45ab3b7310d4464478b81cd2072a4e2" exitCode=0 Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.298711 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a","Type":"ContainerDied","Data":"4dbcf025859eb1880605762156ae96e1c45ab3b7310d4464478b81cd2072a4e2"} Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.306548 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-openstack-config\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.306611 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.306655 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn9rq\" (UniqueName: \"kubernetes.io/projected/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-kube-api-access-rn9rq\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.306675 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.322820 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5df97c965f-mdrk8" event={"ID":"3bce19a4-5298-4024-b291-19e2d6138081","Type":"ContainerStarted","Data":"670a6d60a21fc78c0b2d1c539e1c467f824d5ec308358fcf416b59e4fbf19936"} Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.415690 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-openstack-config\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 
crc kubenswrapper[4693]: I1125 12:28:08.415765 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.417109 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-openstack-config\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.430032 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-openstack-config-secret\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.439470 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn9rq\" (UniqueName: \"kubernetes.io/projected/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-kube-api-access-rn9rq\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.439523 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.457212 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.462819 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn9rq\" (UniqueName: \"kubernetes.io/projected/d51f97b0-16ac-43b8-aa77-b2a66faef2cd-kube-api-access-rn9rq\") pod \"openstackclient\" (UID: \"d51f97b0-16ac-43b8-aa77-b2a66faef2cd\") " pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.561470 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.572871 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.642775 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data-custom\") pod \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.644412 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-scripts\") pod \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.644478 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-combined-ca-bundle\") pod \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.644517 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data\") pod \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.645357 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-etc-machine-id\") pod \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.645449 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fllqm\" (UniqueName: \"kubernetes.io/projected/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-kube-api-access-fllqm\") pod \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\" (UID: \"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a\") " Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.647444 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" (UID: "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.651341 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" (UID: "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.658793 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-kube-api-access-fllqm" (OuterVolumeSpecName: "kube-api-access-fllqm") pod "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" (UID: "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a"). InnerVolumeSpecName "kube-api-access-fllqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.659064 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-scripts" (OuterVolumeSpecName: "scripts") pod "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" (UID: "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.724512 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" (UID: "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.754789 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.754814 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.754823 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.754831 4693 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.754840 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fllqm\" (UniqueName: \"kubernetes.io/projected/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-kube-api-access-fllqm\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.789497 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data" (OuterVolumeSpecName: "config-data") pod "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" (UID: "dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:08 crc kubenswrapper[4693]: I1125 12:28:08.858707 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.155054 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.350139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5df97c965f-mdrk8" event={"ID":"3bce19a4-5298-4024-b291-19e2d6138081","Type":"ContainerStarted","Data":"32ed41f3805dd71bd57aa40d5661ef1373d04a712948dcfe66b22ac5fd4c7982"} Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.350229 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5df97c965f-mdrk8" event={"ID":"3bce19a4-5298-4024-b291-19e2d6138081","Type":"ContainerStarted","Data":"3b27ffe3b360ee19f0d4063af7695cad6de2c76d9486590a003e85e62cd974db"} Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.350344 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.350653 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.352735 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d51f97b0-16ac-43b8-aa77-b2a66faef2cd","Type":"ContainerStarted","Data":"2169513cd11f44d8bbeb7ba162d1419d65df5aac0c2214aa67b02de6c21159f6"} Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.363620 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a","Type":"ContainerDied","Data":"1b5b43571192b932f03219da31097743a9c799a6098a545a56e8f6c192c5fb15"} Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.363693 4693 scope.go:117] "RemoveContainer" containerID="a2a9b8694e0ee2319282287d84825b4cde5ac4bd358c44f61561e1d7eda1096b" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.363728 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.375192 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5df97c965f-mdrk8" podStartSLOduration=3.375173817 podStartE2EDuration="3.375173817s" podCreationTimestamp="2025-11-25 12:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:09.371826523 +0000 UTC m=+1209.289911904" watchObservedRunningTime="2025-11-25 12:28:09.375173817 +0000 UTC m=+1209.293259198" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.392804 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.403569 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.409514 4693 scope.go:117] "RemoveContainer" containerID="4dbcf025859eb1880605762156ae96e1c45ab3b7310d4464478b81cd2072a4e2" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.433350 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:28:09 crc kubenswrapper[4693]: E1125 12:28:09.433833 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="cinder-scheduler" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.433857 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="cinder-scheduler" Nov 25 12:28:09 crc kubenswrapper[4693]: E1125 12:28:09.433873 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="probe" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.433881 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="probe" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.434134 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="cinder-scheduler" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.434167 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" containerName="probe" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.435264 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.441433 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.448490 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.570508 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2t2x\" (UniqueName: \"kubernetes.io/projected/42d5e91d-841b-453a-a5db-f2d1bf40fbec-kube-api-access-w2t2x\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.570601 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-scripts\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.570631 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d5e91d-841b-453a-a5db-f2d1bf40fbec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.570748 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-config-data\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.570773 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.570849 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.672598 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-config-data\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.672654 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.672740 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.672771 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2t2x\" (UniqueName: \"kubernetes.io/projected/42d5e91d-841b-453a-a5db-f2d1bf40fbec-kube-api-access-w2t2x\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.672808 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-scripts\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.672825 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d5e91d-841b-453a-a5db-f2d1bf40fbec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.672897 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d5e91d-841b-453a-a5db-f2d1bf40fbec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.678515 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.678952 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-config-data\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.678998 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-scripts\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.681357 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d5e91d-841b-453a-a5db-f2d1bf40fbec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.701038 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2t2x\" (UniqueName: \"kubernetes.io/projected/42d5e91d-841b-453a-a5db-f2d1bf40fbec-kube-api-access-w2t2x\") pod \"cinder-scheduler-0\" (UID: \"42d5e91d-841b-453a-a5db-f2d1bf40fbec\") " pod="openstack/cinder-scheduler-0" Nov 25 
12:28:09 crc kubenswrapper[4693]: I1125 12:28:09.762726 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.065460 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.065934 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-central-agent" containerID="cri-o://0a07f378bdf5f11d549e1b6c7b6f53f411963a48b4407744f762aca7905fc9f4" gracePeriod=30 Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.066455 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="proxy-httpd" containerID="cri-o://9324239db46fe8a9ef39280b7f2d83c67bd77feeaff39d2316a74fe36197055e" gracePeriod=30 Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.066536 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="sg-core" containerID="cri-o://9fe7c42f488b278c479461f0c4076420123053701d3ec2aff03e9d259fe36b65" gracePeriod=30 Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.066583 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-notification-agent" containerID="cri-o://3cf88dcc2049c6ad6f192ea88943ed16f2524e3a00b421e237b0f9b3608a6f93" gracePeriod=30 Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.100736 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": EOF" Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.418621 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.426671 4693 generic.go:334] "Generic (PLEG): container finished" podID="95762039-b403-4702-9c63-025018a0d833" containerID="9324239db46fe8a9ef39280b7f2d83c67bd77feeaff39d2316a74fe36197055e" exitCode=0 Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.426714 4693 generic.go:334] "Generic (PLEG): container finished" podID="95762039-b403-4702-9c63-025018a0d833" containerID="9fe7c42f488b278c479461f0c4076420123053701d3ec2aff03e9d259fe36b65" exitCode=2 Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.426792 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerDied","Data":"9324239db46fe8a9ef39280b7f2d83c67bd77feeaff39d2316a74fe36197055e"} Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.426823 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerDied","Data":"9fe7c42f488b278c479461f0c4076420123053701d3ec2aff03e9d259fe36b65"} Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.491443 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:28:10 crc kubenswrapper[4693]: I1125 12:28:10.829580 4693 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a" path="/var/lib/kubelet/pods/dd00fccc-6a75-4f2a-a950-4e6b6ea3fa2a/volumes" Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.073927 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c49f9f854-clbv4" Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.168924 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75fb97c486-4w76r"] Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.191135 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75fb97c486-4w76r" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api-log" containerID="cri-o://ad0bf7300e1387eaba5190c6c5d1df54634ae87a3c146b97422f5dae5aff3653" gracePeriod=30 Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.192204 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75fb97c486-4w76r" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api" containerID="cri-o://60629e8736cd7781cad2195e7d50a379bdfd10cacd5131cbd36cb0edc07606f7" gracePeriod=30 Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.480236 4693 generic.go:334] "Generic (PLEG): container finished" podID="95762039-b403-4702-9c63-025018a0d833" containerID="0a07f378bdf5f11d549e1b6c7b6f53f411963a48b4407744f762aca7905fc9f4" exitCode=0 Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.480425 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerDied","Data":"0a07f378bdf5f11d549e1b6c7b6f53f411963a48b4407744f762aca7905fc9f4"} Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.495424 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"42d5e91d-841b-453a-a5db-f2d1bf40fbec","Type":"ContainerStarted","Data":"82bdee6dbf9d9a431b886af37676e94d30c647f32ca40458c84f38c560864711"} Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.511246 4693 generic.go:334] "Generic (PLEG): container finished" podID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerID="ad0bf7300e1387eaba5190c6c5d1df54634ae87a3c146b97422f5dae5aff3653" exitCode=143 Nov 25 12:28:11 crc kubenswrapper[4693]: I1125 12:28:11.512666 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75fb97c486-4w76r" event={"ID":"d1342b7d-3a59-4236-9673-f0b377a5657d","Type":"ContainerDied","Data":"ad0bf7300e1387eaba5190c6c5d1df54634ae87a3c146b97422f5dae5aff3653"} Nov 25 12:28:12 crc kubenswrapper[4693]: I1125 12:28:12.526622 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"42d5e91d-841b-453a-a5db-f2d1bf40fbec","Type":"ContainerStarted","Data":"522b14ccdffa5b9982baf79f12a8826df1e1766a9b4d09afcc06441a488157c6"} Nov 25 12:28:12 crc kubenswrapper[4693]: I1125 12:28:12.526976 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"42d5e91d-841b-453a-a5db-f2d1bf40fbec","Type":"ContainerStarted","Data":"b2a18c4ba32afe29960a3b195a5d07728226c89b79a7e4b6cf31c9bcad423244"} Nov 25 12:28:12 crc kubenswrapper[4693]: I1125 12:28:12.531221 4693 generic.go:334] "Generic (PLEG): container finished" podID="1f60abf7-3c23-4174-9150-50061c054cf5" containerID="d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f" exitCode=0 Nov 25 12:28:12 crc kubenswrapper[4693]: 
I1125 12:28:12.531273 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbcbd4584-78jln" event={"ID":"1f60abf7-3c23-4174-9150-50061c054cf5","Type":"ContainerDied","Data":"d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f"} Nov 25 12:28:12 crc kubenswrapper[4693]: I1125 12:28:12.557822 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.557795316 podStartE2EDuration="3.557795316s" podCreationTimestamp="2025-11-25 12:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:12.549350946 +0000 UTC m=+1212.467436347" watchObservedRunningTime="2025-11-25 12:28:12.557795316 +0000 UTC m=+1212.475880697" Nov 25 12:28:12 crc kubenswrapper[4693]: I1125 12:28:12.565687 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bbcbd4584-78jln" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 25 12:28:12 crc kubenswrapper[4693]: I1125 12:28:12.985685 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 25 12:28:13 crc kubenswrapper[4693]: I1125 12:28:13.545237 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerDied","Data":"3cf88dcc2049c6ad6f192ea88943ed16f2524e3a00b421e237b0f9b3608a6f93"} Nov 25 12:28:13 crc kubenswrapper[4693]: I1125 12:28:13.545129 4693 generic.go:334] "Generic (PLEG): container finished" podID="95762039-b403-4702-9c63-025018a0d833" containerID="3cf88dcc2049c6ad6f192ea88943ed16f2524e3a00b421e237b0f9b3608a6f93" exitCode=0 Nov 25 12:28:13 crc kubenswrapper[4693]: I1125 12:28:13.996925 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.034540 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-combined-ca-bundle\") pod \"95762039-b403-4702-9c63-025018a0d833\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.034585 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2m6h\" (UniqueName: \"kubernetes.io/projected/95762039-b403-4702-9c63-025018a0d833-kube-api-access-l2m6h\") pod \"95762039-b403-4702-9c63-025018a0d833\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.034651 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-run-httpd\") pod \"95762039-b403-4702-9c63-025018a0d833\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.034711 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-config-data\") pod \"95762039-b403-4702-9c63-025018a0d833\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.035052 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-sg-core-conf-yaml\") pod \"95762039-b403-4702-9c63-025018a0d833\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.035098 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-scripts\") pod \"95762039-b403-4702-9c63-025018a0d833\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.035138 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-log-httpd\") pod \"95762039-b403-4702-9c63-025018a0d833\" (UID: \"95762039-b403-4702-9c63-025018a0d833\") " Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.035510 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95762039-b403-4702-9c63-025018a0d833" (UID: "95762039-b403-4702-9c63-025018a0d833"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.036314 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95762039-b403-4702-9c63-025018a0d833" (UID: "95762039-b403-4702-9c63-025018a0d833"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.055580 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-scripts" (OuterVolumeSpecName: "scripts") pod "95762039-b403-4702-9c63-025018a0d833" (UID: "95762039-b403-4702-9c63-025018a0d833"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.063519 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95762039-b403-4702-9c63-025018a0d833-kube-api-access-l2m6h" (OuterVolumeSpecName: "kube-api-access-l2m6h") pod "95762039-b403-4702-9c63-025018a0d833" (UID: "95762039-b403-4702-9c63-025018a0d833"). InnerVolumeSpecName "kube-api-access-l2m6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.077543 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95762039-b403-4702-9c63-025018a0d833" (UID: "95762039-b403-4702-9c63-025018a0d833"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.136718 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.136750 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.136761 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.136771 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95762039-b403-4702-9c63-025018a0d833-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.136779 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2m6h\" (UniqueName: \"kubernetes.io/projected/95762039-b403-4702-9c63-025018a0d833-kube-api-access-l2m6h\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.148902 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95762039-b403-4702-9c63-025018a0d833" (UID: "95762039-b403-4702-9c63-025018a0d833"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.170615 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-config-data" (OuterVolumeSpecName: "config-data") pod "95762039-b403-4702-9c63-025018a0d833" (UID: "95762039-b403-4702-9c63-025018a0d833"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.238751 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.238799 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95762039-b403-4702-9c63-025018a0d833-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.459772 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75fb97c486-4w76r" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:35246->10.217.0.162:9311: read: connection reset by peer" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.459772 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75fb97c486-4w76r" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:35244->10.217.0.162:9311: read: connection reset by peer" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.556851 4693 generic.go:334] "Generic (PLEG): container finished" podID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerID="60629e8736cd7781cad2195e7d50a379bdfd10cacd5131cbd36cb0edc07606f7" exitCode=0 Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.556923 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75fb97c486-4w76r" event={"ID":"d1342b7d-3a59-4236-9673-f0b377a5657d","Type":"ContainerDied","Data":"60629e8736cd7781cad2195e7d50a379bdfd10cacd5131cbd36cb0edc07606f7"} Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.561754 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95762039-b403-4702-9c63-025018a0d833","Type":"ContainerDied","Data":"16990bd1384f409a258f9cf9079354c3a5e0f2c6159cac42a2e692094cf4da87"} Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.561790 4693 scope.go:117] "RemoveContainer" containerID="9324239db46fe8a9ef39280b7f2d83c67bd77feeaff39d2316a74fe36197055e" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.561903 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.620876 4693 scope.go:117] "RemoveContainer" containerID="9fe7c42f488b278c479461f0c4076420123053701d3ec2aff03e9d259fe36b65" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.628443 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.643585 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.662451 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:14 crc kubenswrapper[4693]: E1125 12:28:14.662889 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-central-agent" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.662907 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-central-agent" Nov 25 12:28:14 crc kubenswrapper[4693]: E1125 12:28:14.662922 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="sg-core" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.662928 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="sg-core" Nov 25 12:28:14 crc kubenswrapper[4693]: E1125 12:28:14.662958 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="proxy-httpd" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.662964 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="proxy-httpd" Nov 25 12:28:14 crc kubenswrapper[4693]: E1125 12:28:14.662976 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-notification-agent" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.662981 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-notification-agent" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.663138 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="sg-core" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.663156 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-central-agent" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.663167 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="proxy-httpd" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.663177 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="95762039-b403-4702-9c63-025018a0d833" containerName="ceilometer-notification-agent" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.664742 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.668289 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.668716 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.670279 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.731310 4693 scope.go:117] "RemoveContainer" containerID="3cf88dcc2049c6ad6f192ea88943ed16f2524e3a00b421e237b0f9b3608a6f93" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.747867 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-scripts\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.747924 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-log-httpd\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.747995 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-run-httpd\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.748050 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.748093 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.748130 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktt4\" (UniqueName: \"kubernetes.io/projected/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-kube-api-access-sktt4\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.748209 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-config-data\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.764313 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 
12:28:14.773630 4693 scope.go:117] "RemoveContainer" containerID="0a07f378bdf5f11d549e1b6c7b6f53f411963a48b4407744f762aca7905fc9f4" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.829356 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95762039-b403-4702-9c63-025018a0d833" path="/var/lib/kubelet/pods/95762039-b403-4702-9c63-025018a0d833/volumes" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.850000 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-run-httpd\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.850153 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.850227 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.850275 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sktt4\" (UniqueName: \"kubernetes.io/projected/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-kube-api-access-sktt4\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.850307 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-config-data\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.850414 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-scripts\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.850444 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-log-httpd\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.852705 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-run-httpd\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.857103 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-log-httpd\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc 
kubenswrapper[4693]: I1125 12:28:14.859790 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.870994 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.871565 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktt4\" (UniqueName: \"kubernetes.io/projected/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-kube-api-access-sktt4\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.874851 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-config-data\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:14 crc kubenswrapper[4693]: I1125 12:28:14.895721 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-scripts\") pod \"ceilometer-0\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " pod="openstack/ceilometer-0" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.011502 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.101055 4693 util.go:48] "No ready sandbox for pod can be found. 
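The ceilometer-0 replacement above walks every volume through the same three reconciler phases: operationExecutor.VerifyControllerAttachedVolume started, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded. A small bookkeeping sketch (stdlib Go; the phase strings are taken from the entries, with klog's \" escapes removed from the abridged samples) that replays lines into a per-volume phase map:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// matches: ... for volume "scripts" ...
var volRe = regexp.MustCompile(`for volume "([^"]+)"`)

func phase(line string) string {
	switch {
	case strings.Contains(line, "VerifyControllerAttachedVolume started"):
		return "attach verified"
	case strings.Contains(line, "MountVolume started"):
		return "mount started"
	case strings.Contains(line, "MountVolume.SetUp succeeded"):
		return "mounted"
	}
	return ""
}

func main() {
	// abridged from the ceilometer-0 "scripts" volume entries above
	lines := []string{
		`operationExecutor.VerifyControllerAttachedVolume started for volume "scripts"`,
		`operationExecutor.MountVolume started for volume "scripts"`,
		`MountVolume.SetUp succeeded for volume "scripts"`,
	}
	last := map[string]string{} // volume name -> last observed phase
	for _, l := range lines {
		m := volRe.FindStringSubmatch(l)
		if p := phase(l); m != nil && p != "" {
			last[m[1]] = p
		}
	}
	fmt.Println(last) // map[scripts:mounted]
}
```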
Need to start a new one" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.156775 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1342b7d-3a59-4236-9673-f0b377a5657d-logs\") pod \"d1342b7d-3a59-4236-9673-f0b377a5657d\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.156842 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data-custom\") pod \"d1342b7d-3a59-4236-9673-f0b377a5657d\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.156933 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm4k4\" (UniqueName: \"kubernetes.io/projected/d1342b7d-3a59-4236-9673-f0b377a5657d-kube-api-access-wm4k4\") pod \"d1342b7d-3a59-4236-9673-f0b377a5657d\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.157057 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-combined-ca-bundle\") pod \"d1342b7d-3a59-4236-9673-f0b377a5657d\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.157084 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data\") pod \"d1342b7d-3a59-4236-9673-f0b377a5657d\" (UID: \"d1342b7d-3a59-4236-9673-f0b377a5657d\") " Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.164907 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1342b7d-3a59-4236-9673-f0b377a5657d-logs" (OuterVolumeSpecName: "logs") pod "d1342b7d-3a59-4236-9673-f0b377a5657d" (UID: "d1342b7d-3a59-4236-9673-f0b377a5657d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.169569 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1342b7d-3a59-4236-9673-f0b377a5657d-kube-api-access-wm4k4" (OuterVolumeSpecName: "kube-api-access-wm4k4") pod "d1342b7d-3a59-4236-9673-f0b377a5657d" (UID: "d1342b7d-3a59-4236-9673-f0b377a5657d"). InnerVolumeSpecName "kube-api-access-wm4k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.174524 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d1342b7d-3a59-4236-9673-f0b377a5657d" (UID: "d1342b7d-3a59-4236-9673-f0b377a5657d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.211907 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1342b7d-3a59-4236-9673-f0b377a5657d" (UID: "d1342b7d-3a59-4236-9673-f0b377a5657d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.220646 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data" (OuterVolumeSpecName: "config-data") pod "d1342b7d-3a59-4236-9673-f0b377a5657d" (UID: "d1342b7d-3a59-4236-9673-f0b377a5657d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.259127 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.259168 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.259180 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1342b7d-3a59-4236-9673-f0b377a5657d-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.259196 4693 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1342b7d-3a59-4236-9673-f0b377a5657d-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.259207 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm4k4\" (UniqueName: \"kubernetes.io/projected/d1342b7d-3a59-4236-9673-f0b377a5657d-kube-api-access-wm4k4\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.581146 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75fb97c486-4w76r" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.581906 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75fb97c486-4w76r" event={"ID":"d1342b7d-3a59-4236-9673-f0b377a5657d","Type":"ContainerDied","Data":"d7961009b26615ac257554386c2cab8bd2605925fa87b1ebc1ec4bdf2941f3cd"} Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.581958 4693 scope.go:117] "RemoveContainer" containerID="60629e8736cd7781cad2195e7d50a379bdfd10cacd5131cbd36cb0edc07606f7" Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.623199 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.635342 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75fb97c486-4w76r"] Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.646747 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75fb97c486-4w76r"] Nov 25 12:28:15 crc kubenswrapper[4693]: I1125 12:28:15.653168 4693 scope.go:117] "RemoveContainer" containerID="ad0bf7300e1387eaba5190c6c5d1df54634ae87a3c146b97422f5dae5aff3653" Nov 25 12:28:16 crc kubenswrapper[4693]: I1125 12:28:16.622942 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerStarted","Data":"1d5ab95e24871d2d267a12c36d7d1cb3a7dbfc0d6706950d9e23aace166d98ea"} Nov 25 12:28:16 crc kubenswrapper[4693]: I1125 12:28:16.623249 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerStarted","Data":"931243106c248910fa18441a9a5d6af283b0e04986e9b34908211b8c726690d1"} Nov 25 12:28:16 crc kubenswrapper[4693]: I1125 12:28:16.834713 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" path="/var/lib/kubelet/pods/d1342b7d-3a59-4236-9673-f0b377a5657d/volumes" Nov 25 12:28:17 crc kubenswrapper[4693]: I1125 12:28:17.290862 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:17 crc kubenswrapper[4693]: I1125 12:28:17.297400 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5df97c965f-mdrk8" Nov 25 12:28:17 crc kubenswrapper[4693]: I1125 12:28:17.451944 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:28:17 crc kubenswrapper[4693]: I1125 12:28:17.637302 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerStarted","Data":"512b6bb91b98ceca522e017dbcd8c7c443da9a5f551bb0b9473195828175809a"} Nov 25 12:28:18 crc kubenswrapper[4693]: I1125 12:28:18.508229 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:19 crc kubenswrapper[4693]: I1125 12:28:19.466960 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57c497f557-r9sp7" Nov 25 12:28:19 crc kubenswrapper[4693]: I1125 12:28:19.533802 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5545f6bb6d-vdngh"] Nov 25 12:28:19 crc kubenswrapper[4693]: I1125 12:28:19.534049 4693 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-5545f6bb6d-vdngh" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-api" containerID="cri-o://a5e1cedfcb4eb196cf8cacbccc23aae6033bf7273ba1de2cf3d0818143c00a2c" gracePeriod=30 Nov 25 12:28:19 crc kubenswrapper[4693]: I1125 12:28:19.534163 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5545f6bb6d-vdngh" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-httpd" containerID="cri-o://bdb58e6825c07b11690e9e20ed28fd650708d53911c39bfd49c42098e1282447" gracePeriod=30 Nov 25 12:28:20 crc kubenswrapper[4693]: I1125 12:28:20.024057 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 25 12:28:20 crc kubenswrapper[4693]: I1125 12:28:20.677646 4693 generic.go:334] "Generic (PLEG): container finished" podID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerID="bdb58e6825c07b11690e9e20ed28fd650708d53911c39bfd49c42098e1282447" exitCode=0 Nov 25 12:28:20 crc kubenswrapper[4693]: I1125 12:28:20.677866 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5545f6bb6d-vdngh" event={"ID":"6921c173-adf8-47d6-9e9b-98657a453bdd","Type":"ContainerDied","Data":"bdb58e6825c07b11690e9e20ed28fd650708d53911c39bfd49c42098e1282447"} Nov 25 12:28:21 crc kubenswrapper[4693]: I1125 12:28:21.693672 4693 generic.go:334] "Generic (PLEG): container finished" podID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerID="a5e1cedfcb4eb196cf8cacbccc23aae6033bf7273ba1de2cf3d0818143c00a2c" exitCode=0 Nov 25 12:28:21 crc kubenswrapper[4693]: I1125 12:28:21.693723 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5545f6bb6d-vdngh" event={"ID":"6921c173-adf8-47d6-9e9b-98657a453bdd","Type":"ContainerDied","Data":"a5e1cedfcb4eb196cf8cacbccc23aae6033bf7273ba1de2cf3d0818143c00a2c"} Nov 25 12:28:21 crc kubenswrapper[4693]: I1125 12:28:21.788979 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:28:21 crc kubenswrapper[4693]: I1125 12:28:21.789227 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-log" containerID="cri-o://95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f" gracePeriod=30 Nov 25 12:28:21 crc kubenswrapper[4693]: I1125 12:28:21.789319 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-httpd" containerID="cri-o://09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75" gracePeriod=30 Nov 25 12:28:22 crc kubenswrapper[4693]: I1125 12:28:22.565770 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bbcbd4584-78jln" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 25 12:28:22 crc kubenswrapper[4693]: I1125 12:28:22.703943 4693 generic.go:334] "Generic (PLEG): container finished" podID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerID="95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f" exitCode=143 Nov 25 12:28:22 crc kubenswrapper[4693]: I1125 12:28:22.703989 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"a527ec6b-b211-43e1-afd2-6cfd2d60291a","Type":"ContainerDied","Data":"95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f"} Nov 25 12:28:23 crc kubenswrapper[4693]: I1125 12:28:23.455621 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:28:23 crc kubenswrapper[4693]: I1125 12:28:23.455869 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-log" containerID="cri-o://b4f5419c71b94019c4a7731f744eac9712e45c8cde2a446e4cffbfba4d3efbbb" gracePeriod=30 Nov 25 12:28:23 crc kubenswrapper[4693]: I1125 12:28:23.455944 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-httpd" containerID="cri-o://06de4c38e61999a12ce2fcb3bb3173374752e4183c7a3e7c975130c1e666ad25" gracePeriod=30 Nov 25 12:28:24 crc kubenswrapper[4693]: I1125 12:28:24.733827 4693 generic.go:334] "Generic (PLEG): container finished" podID="a3386396-6766-42b0-a683-af6f6c2da021" containerID="b4f5419c71b94019c4a7731f744eac9712e45c8cde2a446e4cffbfba4d3efbbb" exitCode=143 Nov 25 12:28:24 crc kubenswrapper[4693]: I1125 12:28:24.733927 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a3386396-6766-42b0-a683-af6f6c2da021","Type":"ContainerDied","Data":"b4f5419c71b94019c4a7731f744eac9712e45c8cde2a446e4cffbfba4d3efbbb"} Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.309127 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.364488 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4d6n\" (UniqueName: \"kubernetes.io/projected/6921c173-adf8-47d6-9e9b-98657a453bdd-kube-api-access-x4d6n\") pod \"6921c173-adf8-47d6-9e9b-98657a453bdd\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.364685 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-httpd-config\") pod \"6921c173-adf8-47d6-9e9b-98657a453bdd\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.364712 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-ovndb-tls-certs\") pod \"6921c173-adf8-47d6-9e9b-98657a453bdd\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.364758 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-config\") pod \"6921c173-adf8-47d6-9e9b-98657a453bdd\" (UID: \"6921c173-adf8-47d6-9e9b-98657a453bdd\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.364801 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-combined-ca-bundle\") pod \"6921c173-adf8-47d6-9e9b-98657a453bdd\" (UID: 
\"6921c173-adf8-47d6-9e9b-98657a453bdd\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.372650 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6921c173-adf8-47d6-9e9b-98657a453bdd" (UID: "6921c173-adf8-47d6-9e9b-98657a453bdd"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.383694 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6921c173-adf8-47d6-9e9b-98657a453bdd-kube-api-access-x4d6n" (OuterVolumeSpecName: "kube-api-access-x4d6n") pod "6921c173-adf8-47d6-9e9b-98657a453bdd" (UID: "6921c173-adf8-47d6-9e9b-98657a453bdd"). InnerVolumeSpecName "kube-api-access-x4d6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.449983 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-config" (OuterVolumeSpecName: "config") pod "6921c173-adf8-47d6-9e9b-98657a453bdd" (UID: "6921c173-adf8-47d6-9e9b-98657a453bdd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.462041 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6921c173-adf8-47d6-9e9b-98657a453bdd" (UID: "6921c173-adf8-47d6-9e9b-98657a453bdd"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.464248 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6921c173-adf8-47d6-9e9b-98657a453bdd" (UID: "6921c173-adf8-47d6-9e9b-98657a453bdd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.466463 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4d6n\" (UniqueName: \"kubernetes.io/projected/6921c173-adf8-47d6-9e9b-98657a453bdd-kube-api-access-x4d6n\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.466505 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.466517 4693 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.466528 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.466539 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6921c173-adf8-47d6-9e9b-98657a453bdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.496667 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567135 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567224 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-public-tls-certs\") pod \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567280 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-httpd-run\") pod \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567310 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-logs\") pod \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567448 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-combined-ca-bundle\") pod \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567493 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-config-data\") pod 
\"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567540 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9xv7\" (UniqueName: \"kubernetes.io/projected/a527ec6b-b211-43e1-afd2-6cfd2d60291a-kube-api-access-h9xv7\") pod \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.567648 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-scripts\") pod \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\" (UID: \"a527ec6b-b211-43e1-afd2-6cfd2d60291a\") " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.569277 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-logs" (OuterVolumeSpecName: "logs") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.569606 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.571773 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-scripts" (OuterVolumeSpecName: "scripts") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.576879 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.578551 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a527ec6b-b211-43e1-afd2-6cfd2d60291a-kube-api-access-h9xv7" (OuterVolumeSpecName: "kube-api-access-h9xv7") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "kube-api-access-h9xv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.615028 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.628710 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.629092 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-config-data" (OuterVolumeSpecName: "config-data") pod "a527ec6b-b211-43e1-afd2-6cfd2d60291a" (UID: "a527ec6b-b211-43e1-afd2-6cfd2d60291a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669554 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669609 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669621 4693 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669632 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669641 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a527ec6b-b211-43e1-afd2-6cfd2d60291a-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669649 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669659 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a527ec6b-b211-43e1-afd2-6cfd2d60291a-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.669668 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9xv7\" (UniqueName: \"kubernetes.io/projected/a527ec6b-b211-43e1-afd2-6cfd2d60291a-kube-api-access-h9xv7\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.692078 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.746989 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerStarted","Data":"ac9f65daac0a61ea1a80f60098a6d76f93fd6845c2ae9022577207dde3663cf4"} Nov 25 
12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.749970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d51f97b0-16ac-43b8-aa77-b2a66faef2cd","Type":"ContainerStarted","Data":"f8952e547f556e2be1ac5d6801688d35897d874d56d2b614f6dfc76f2c7e0864"} Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.757824 4693 generic.go:334] "Generic (PLEG): container finished" podID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerID="09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75" exitCode=0 Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.757932 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a527ec6b-b211-43e1-afd2-6cfd2d60291a","Type":"ContainerDied","Data":"09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75"} Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.757970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a527ec6b-b211-43e1-afd2-6cfd2d60291a","Type":"ContainerDied","Data":"04959e2365e811a09859ef558735122dfd01b478e0d2b21ba289f4001033c099"} Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.757998 4693 scope.go:117] "RemoveContainer" containerID="09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.758153 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.762474 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5545f6bb6d-vdngh" event={"ID":"6921c173-adf8-47d6-9e9b-98657a453bdd","Type":"ContainerDied","Data":"fcf57349ab784191df0b0a745d9f2d2bab70f9882baa88d392bfd203b86a8bb6"} Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.762568 4693 util.go:48] "No ready sandbox for pod can be found. 
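The event={...} payloads in the SyncLoop (PLEG) records are printed as JSON, so they can be unmarshalled directly; the field names ID/Type/Data are exactly as logged. The sample below is the glance-default-external-api-0 ContainerDied event copied from the entries above:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type plegEvent struct {
	ID   string // pod UID
	Type string // e.g. ContainerStarted, ContainerDied
	Data string // container or sandbox ID
}

func main() {
	raw := `{"ID":"a527ec6b-b211-43e1-afd2-6cfd2d60291a","Type":"ContainerDied","Data":"09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75"}`
	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}
```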
Need to start a new one" pod="openstack/neutron-5545f6bb6d-vdngh" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.771519 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.778945 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.955532332 podStartE2EDuration="17.778903802s" podCreationTimestamp="2025-11-25 12:28:08 +0000 UTC" firstStartedPulling="2025-11-25 12:28:09.150939284 +0000 UTC m=+1209.069024675" lastFinishedPulling="2025-11-25 12:28:24.974310764 +0000 UTC m=+1224.892396145" observedRunningTime="2025-11-25 12:28:25.766690106 +0000 UTC m=+1225.684775487" watchObservedRunningTime="2025-11-25 12:28:25.778903802 +0000 UTC m=+1225.696989183" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.802570 4693 scope.go:117] "RemoveContainer" containerID="95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.813419 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5545f6bb6d-vdngh"] Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.822977 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5545f6bb6d-vdngh"] Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.832195 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.851652 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867162 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.867564 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-httpd" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867578 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-httpd" Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.867590 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api-log" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867597 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api-log" Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.867612 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-httpd" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867618 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-httpd" Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.867641 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-api" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867646 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-api" Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.867660 4693 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-log" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867666 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-log" Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.867676 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867681 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867830 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867841 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-httpd" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867853 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" containerName="neutron-api" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867865 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1342b7d-3a59-4236-9673-f0b377a5657d" containerName="barbican-api-log" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867880 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-log" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.867889 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" containerName="glance-httpd" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.869045 4693 util.go:30] "No sandbox for pod can be found. 
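The pod_startup_latency_tracker record for openstack/openstackclient above is internally consistent: podStartSLOduration is the end-to-end startup duration minus the image-pull window, and taking that window from the monotonic m=+... readings reproduces the logged value to the nanosecond. A quick check in Go, with the constants copied from the entry:

```go
package main

import "fmt"

func main() {
	e2e := 17.778903802         // podStartE2EDuration, seconds
	firstPull := 1209.069024675 // firstStartedPulling, monotonic m=+ reading
	lastPull := 1224.892396145  // lastFinishedPulling, monotonic m=+ reading

	// SLO duration excludes the time spent pulling images
	slo := e2e - (lastPull - firstPull)
	fmt.Printf("podStartSLOduration = %.9f s\n", slo) // 1.955532332, as logged
}
```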
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.875501 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.875596 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.894659 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.921402 4693 scope.go:117] "RemoveContainer" containerID="09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75" Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.921890 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75\": container with ID starting with 09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75 not found: ID does not exist" containerID="09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.921918 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75"} err="failed to get container status \"09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75\": rpc error: code = NotFound desc = could not find container \"09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75\": container with ID starting with 09569a47aba72b7ecf5f4a55c2676277e795448c30e8d056b6ad2206a685cf75 not found: ID does not exist" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.921937 4693 scope.go:117] "RemoveContainer" containerID="95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f" Nov 25 12:28:25 crc kubenswrapper[4693]: E1125 12:28:25.922359 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f\": container with ID starting with 95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f not found: ID does not exist" containerID="95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.922418 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f"} err="failed to get container status \"95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f\": rpc error: code = NotFound desc = could not find container \"95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f\": container with ID starting with 95e845e060aaeec561b39524ec42ee4022cb9bae8990ec3d3f55e5a608fb586f not found: ID does not exist" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.922460 4693 scope.go:117] "RemoveContainer" containerID="bdb58e6825c07b11690e9e20ed28fd650708d53911c39bfd49c42098e1282447" Nov 25 12:28:25 crc kubenswrapper[4693]: I1125 12:28:25.944404 4693 scope.go:117] "RemoveContainer" containerID="a5e1cedfcb4eb196cf8cacbccc23aae6033bf7273ba1de2cf3d0818143c00a2c" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078197 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2v7b\" (UniqueName: \"kubernetes.io/projected/bb2f0f2d-5d66-485f-a389-e07c52f143f2-kube-api-access-k2v7b\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078303 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2f0f2d-5d66-485f-a389-e07c52f143f2-logs\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078359 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078406 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078491 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078528 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2f0f2d-5d66-485f-a389-e07c52f143f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078566 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.078842 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180504 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc 
kubenswrapper[4693]: I1125 12:28:26.180589 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180619 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2v7b\" (UniqueName: \"kubernetes.io/projected/bb2f0f2d-5d66-485f-a389-e07c52f143f2-kube-api-access-k2v7b\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180668 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2f0f2d-5d66-485f-a389-e07c52f143f2-logs\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180728 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180751 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180770 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180795 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2f0f2d-5d66-485f-a389-e07c52f143f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.180907 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.181208 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb2f0f2d-5d66-485f-a389-e07c52f143f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.181667 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb2f0f2d-5d66-485f-a389-e07c52f143f2-logs\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.186685 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.186845 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.187746 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.188612 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb2f0f2d-5d66-485f-a389-e07c52f143f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.203722 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2v7b\" (UniqueName: \"kubernetes.io/projected/bb2f0f2d-5d66-485f-a389-e07c52f143f2-kube-api-access-k2v7b\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.211269 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"bb2f0f2d-5d66-485f-a389-e07c52f143f2\") " pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.223246 4693 util.go:30] "No sandbox for pod can be found. 
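For the local-volume PV the mount is two-staged, visible above: MountVolume.MountDevice stages local-storage11-crc at its device mount path /mnt/openstack/pv11, then MountVolume.SetUp exposes it to glance-default-external-api-0. A small extractor for the staging path (entry abridged, quotes unescaped; the pattern is fitted to the logged wording):

```go
package main

import (
	"fmt"
	"regexp"
)

var devPathRe = regexp.MustCompile(`device mount path "([^"]+)"`)

func main() {
	entry := `MountVolume.MountDevice succeeded for volume "local-storage11-crc" device mount path "/mnt/openstack/pv11"`
	if m := devPathRe.FindStringSubmatch(entry); m != nil {
		fmt.Println("staged at:", m[1]) // staged at: /mnt/openstack/pv11
	}
}
```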
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.742511 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 25 12:28:26 crc kubenswrapper[4693]: W1125 12:28:26.750652 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb2f0f2d_5d66_485f_a389_e07c52f143f2.slice/crio-c391c228fb5a95b2c29c3a51b010b31aa83143954840f6c75f31c23d7756b218 WatchSource:0}: Error finding container c391c228fb5a95b2c29c3a51b010b31aa83143954840f6c75f31c23d7756b218: Status 404 returned error can't find the container with id c391c228fb5a95b2c29c3a51b010b31aa83143954840f6c75f31c23d7756b218 Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.783483 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb2f0f2d-5d66-485f-a389-e07c52f143f2","Type":"ContainerStarted","Data":"c391c228fb5a95b2c29c3a51b010b31aa83143954840f6c75f31c23d7756b218"} Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.789596 4693 generic.go:334] "Generic (PLEG): container finished" podID="a3386396-6766-42b0-a683-af6f6c2da021" containerID="06de4c38e61999a12ce2fcb3bb3173374752e4183c7a3e7c975130c1e666ad25" exitCode=0 Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.789672 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a3386396-6766-42b0-a683-af6f6c2da021","Type":"ContainerDied","Data":"06de4c38e61999a12ce2fcb3bb3173374752e4183c7a3e7c975130c1e666ad25"} Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.792358 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerStarted","Data":"29319aebb06a13984ae4f7e9e34dfc6eff5bed336694da01bbc1b3075c566964"} Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.792612 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-central-agent" containerID="cri-o://1d5ab95e24871d2d267a12c36d7d1cb3a7dbfc0d6706950d9e23aace166d98ea" gracePeriod=30 Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.792636 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="proxy-httpd" containerID="cri-o://29319aebb06a13984ae4f7e9e34dfc6eff5bed336694da01bbc1b3075c566964" gracePeriod=30 Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.792668 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="sg-core" containerID="cri-o://ac9f65daac0a61ea1a80f60098a6d76f93fd6845c2ae9022577207dde3663cf4" gracePeriod=30 Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.792629 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.792751 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-notification-agent" containerID="cri-o://512b6bb91b98ceca522e017dbcd8c7c443da9a5f551bb0b9473195828175809a" gracePeriod=30 Nov 25 12:28:26 crc 
kubenswrapper[4693]: I1125 12:28:26.829459 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6921c173-adf8-47d6-9e9b-98657a453bdd" path="/var/lib/kubelet/pods/6921c173-adf8-47d6-9e9b-98657a453bdd/volumes" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.830301 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a527ec6b-b211-43e1-afd2-6cfd2d60291a" path="/var/lib/kubelet/pods/a527ec6b-b211-43e1-afd2-6cfd2d60291a/volumes" Nov 25 12:28:26 crc kubenswrapper[4693]: I1125 12:28:26.846944 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.064348063 podStartE2EDuration="12.846912994s" podCreationTimestamp="2025-11-25 12:28:14 +0000 UTC" firstStartedPulling="2025-11-25 12:28:15.668288941 +0000 UTC m=+1215.586374322" lastFinishedPulling="2025-11-25 12:28:26.450853872 +0000 UTC m=+1226.368939253" observedRunningTime="2025-11-25 12:28:26.827228936 +0000 UTC m=+1226.745314337" watchObservedRunningTime="2025-11-25 12:28:26.846912994 +0000 UTC m=+1226.764998375" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.230593 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.235768 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-t8c6z"] Nov 25 12:28:27 crc kubenswrapper[4693]: E1125 12:28:27.236149 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-log" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.236164 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-log" Nov 25 12:28:27 crc kubenswrapper[4693]: E1125 12:28:27.236192 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-httpd" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.236200 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-httpd" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.236417 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-httpd" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.236444 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3386396-6766-42b0-a683-af6f6c2da021" containerName="glance-log" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.237015 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.256828 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t8c6z"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419225 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmntp\" (UniqueName: \"kubernetes.io/projected/a3386396-6766-42b0-a683-af6f6c2da021-kube-api-access-gmntp\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419275 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-config-data\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419295 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-internal-tls-certs\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419440 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-logs\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419471 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-scripts\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419500 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-combined-ca-bundle\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419566 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-httpd-run\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419611 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a3386396-6766-42b0-a683-af6f6c2da021\" (UID: \"a3386396-6766-42b0-a683-af6f6c2da021\") " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.419921 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f22823-54fe-41b3-8918-1f9920948635-operator-scripts\") pod \"nova-api-db-create-t8c6z\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.420021 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-6wrmv\" (UniqueName: \"kubernetes.io/projected/76f22823-54fe-41b3-8918-1f9920948635-kube-api-access-6wrmv\") pod \"nova-api-db-create-t8c6z\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.422470 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.423897 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-logs" (OuterVolumeSpecName: "logs") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.426822 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-scripts" (OuterVolumeSpecName: "scripts") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.432457 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f5pb7"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.434285 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3386396-6766-42b0-a683-af6f6c2da021-kube-api-access-gmntp" (OuterVolumeSpecName: "kube-api-access-gmntp") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "kube-api-access-gmntp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.435089 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.438686 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f5pb7"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.443271 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.449042 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e58d-account-create-wxbnx"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.450910 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.459054 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e58d-account-create-wxbnx"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.461755 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.488672 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.495956 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-config-data" (OuterVolumeSpecName: "config-data") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.511266 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a3386396-6766-42b0-a683-af6f6c2da021" (UID: "a3386396-6766-42b0-a683-af6f6c2da021"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523420 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f22823-54fe-41b3-8918-1f9920948635-operator-scripts\") pod \"nova-api-db-create-t8c6z\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523538 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrmv\" (UniqueName: \"kubernetes.io/projected/76f22823-54fe-41b3-8918-1f9920948635-kube-api-access-6wrmv\") pod \"nova-api-db-create-t8c6z\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523610 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523624 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523637 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523649 4693 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a3386396-6766-42b0-a683-af6f6c2da021-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 25 
12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523676 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523688 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmntp\" (UniqueName: \"kubernetes.io/projected/a3386396-6766-42b0-a683-af6f6c2da021-kube-api-access-gmntp\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523698 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.523708 4693 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3386396-6766-42b0-a683-af6f6c2da021-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.524234 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f22823-54fe-41b3-8918-1f9920948635-operator-scripts\") pod \"nova-api-db-create-t8c6z\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.557139 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-k5vzm"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.558829 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.563185 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrmv\" (UniqueName: \"kubernetes.io/projected/76f22823-54fe-41b3-8918-1f9920948635-kube-api-access-6wrmv\") pod \"nova-api-db-create-t8c6z\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.564837 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5f13-account-create-s2vtk"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.566034 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.568501 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.581033 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k5vzm"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.599812 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5f13-account-create-s2vtk"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.600086 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.606135 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.626399 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxdw8\" (UniqueName: \"kubernetes.io/projected/75cab84e-e6f2-4b22-b67b-092223f7bc87-kube-api-access-vxdw8\") pod \"nova-api-e58d-account-create-wxbnx\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.626649 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d1b45a-575e-42f4-b82a-cd8771e650e1-operator-scripts\") pod \"nova-cell0-db-create-f5pb7\" (UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.626844 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nck4f\" (UniqueName: \"kubernetes.io/projected/63d1b45a-575e-42f4-b82a-cd8771e650e1-kube-api-access-nck4f\") pod \"nova-cell0-db-create-f5pb7\" (UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.626995 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75cab84e-e6f2-4b22-b67b-092223f7bc87-operator-scripts\") pod \"nova-api-e58d-account-create-wxbnx\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.627212 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.738153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79cs9\" (UniqueName: \"kubernetes.io/projected/5e34f200-34da-479a-a41c-f831d9e94220-kube-api-access-79cs9\") pod \"nova-cell1-db-create-k5vzm\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.738752 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f09b87-30d2-48ef-8b6b-13c25206a68d-operator-scripts\") pod \"nova-cell0-5f13-account-create-s2vtk\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.738944 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxdw8\" (UniqueName: \"kubernetes.io/projected/75cab84e-e6f2-4b22-b67b-092223f7bc87-kube-api-access-vxdw8\") pod \"nova-api-e58d-account-create-wxbnx\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.739056 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d1b45a-575e-42f4-b82a-cd8771e650e1-operator-scripts\") pod \"nova-cell0-db-create-f5pb7\" 
(UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.739242 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vgq\" (UniqueName: \"kubernetes.io/projected/93f09b87-30d2-48ef-8b6b-13c25206a68d-kube-api-access-p6vgq\") pod \"nova-cell0-5f13-account-create-s2vtk\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.739448 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nck4f\" (UniqueName: \"kubernetes.io/projected/63d1b45a-575e-42f4-b82a-cd8771e650e1-kube-api-access-nck4f\") pod \"nova-cell0-db-create-f5pb7\" (UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.739607 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75cab84e-e6f2-4b22-b67b-092223f7bc87-operator-scripts\") pod \"nova-api-e58d-account-create-wxbnx\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.739753 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e34f200-34da-479a-a41c-f831d9e94220-operator-scripts\") pod \"nova-cell1-db-create-k5vzm\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.754135 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d1b45a-575e-42f4-b82a-cd8771e650e1-operator-scripts\") pod \"nova-cell0-db-create-f5pb7\" (UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.755922 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75cab84e-e6f2-4b22-b67b-092223f7bc87-operator-scripts\") pod \"nova-api-e58d-account-create-wxbnx\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.764736 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7f2a-account-create-gfrxd"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.766511 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.771487 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.780785 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7f2a-account-create-gfrxd"] Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.781620 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nck4f\" (UniqueName: \"kubernetes.io/projected/63d1b45a-575e-42f4-b82a-cd8771e650e1-kube-api-access-nck4f\") pod \"nova-cell0-db-create-f5pb7\" (UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.781971 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxdw8\" (UniqueName: \"kubernetes.io/projected/75cab84e-e6f2-4b22-b67b-092223f7bc87-kube-api-access-vxdw8\") pod \"nova-api-e58d-account-create-wxbnx\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.795270 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.810854 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.841566 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e34f200-34da-479a-a41c-f831d9e94220-operator-scripts\") pod \"nova-cell1-db-create-k5vzm\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.841644 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79cs9\" (UniqueName: \"kubernetes.io/projected/5e34f200-34da-479a-a41c-f831d9e94220-kube-api-access-79cs9\") pod \"nova-cell1-db-create-k5vzm\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.841679 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f09b87-30d2-48ef-8b6b-13c25206a68d-operator-scripts\") pod \"nova-cell0-5f13-account-create-s2vtk\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.841816 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vgq\" (UniqueName: \"kubernetes.io/projected/93f09b87-30d2-48ef-8b6b-13c25206a68d-kube-api-access-p6vgq\") pod \"nova-cell0-5f13-account-create-s2vtk\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.843832 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f09b87-30d2-48ef-8b6b-13c25206a68d-operator-scripts\") pod \"nova-cell0-5f13-account-create-s2vtk\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " 
pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844069 4693 generic.go:334] "Generic (PLEG): container finished" podID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerID="29319aebb06a13984ae4f7e9e34dfc6eff5bed336694da01bbc1b3075c566964" exitCode=0 Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844101 4693 generic.go:334] "Generic (PLEG): container finished" podID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerID="ac9f65daac0a61ea1a80f60098a6d76f93fd6845c2ae9022577207dde3663cf4" exitCode=2 Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844145 4693 generic.go:334] "Generic (PLEG): container finished" podID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerID="512b6bb91b98ceca522e017dbcd8c7c443da9a5f551bb0b9473195828175809a" exitCode=0 Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844153 4693 generic.go:334] "Generic (PLEG): container finished" podID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerID="1d5ab95e24871d2d267a12c36d7d1cb3a7dbfc0d6706950d9e23aace166d98ea" exitCode=0 Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844546 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerDied","Data":"29319aebb06a13984ae4f7e9e34dfc6eff5bed336694da01bbc1b3075c566964"} Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844606 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e34f200-34da-479a-a41c-f831d9e94220-operator-scripts\") pod \"nova-cell1-db-create-k5vzm\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844623 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerDied","Data":"ac9f65daac0a61ea1a80f60098a6d76f93fd6845c2ae9022577207dde3663cf4"} Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844637 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerDied","Data":"512b6bb91b98ceca522e017dbcd8c7c443da9a5f551bb0b9473195828175809a"} Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.844668 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerDied","Data":"1d5ab95e24871d2d267a12c36d7d1cb3a7dbfc0d6706950d9e23aace166d98ea"} Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.858389 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79cs9\" (UniqueName: \"kubernetes.io/projected/5e34f200-34da-479a-a41c-f831d9e94220-kube-api-access-79cs9\") pod \"nova-cell1-db-create-k5vzm\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.862904 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vgq\" (UniqueName: \"kubernetes.io/projected/93f09b87-30d2-48ef-8b6b-13c25206a68d-kube-api-access-p6vgq\") pod \"nova-cell0-5f13-account-create-s2vtk\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.869214 4693 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb2f0f2d-5d66-485f-a389-e07c52f143f2","Type":"ContainerStarted","Data":"f6211bdaf0addd44f42e256fcbaccbd9e144cdbe91d36c86956542ef65d1b0ff"} Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.879844 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a3386396-6766-42b0-a683-af6f6c2da021","Type":"ContainerDied","Data":"11456f80e3966d77698a18e5e794768531126c46a88cecfed810e831392344ae"} Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.879902 4693 scope.go:117] "RemoveContainer" containerID="06de4c38e61999a12ce2fcb3bb3173374752e4183c7a3e7c975130c1e666ad25" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.880096 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.912801 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.918722 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.946695 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e055aa-e90a-4be0-a519-a9b30151eaa3-operator-scripts\") pod \"nova-cell1-7f2a-account-create-gfrxd\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:27 crc kubenswrapper[4693]: I1125 12:28:27.946898 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5vs\" (UniqueName: \"kubernetes.io/projected/a5e055aa-e90a-4be0-a519-a9b30151eaa3-kube-api-access-rt5vs\") pod \"nova-cell1-7f2a-account-create-gfrxd\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.048714 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt5vs\" (UniqueName: \"kubernetes.io/projected/a5e055aa-e90a-4be0-a519-a9b30151eaa3-kube-api-access-rt5vs\") pod \"nova-cell1-7f2a-account-create-gfrxd\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.048795 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e055aa-e90a-4be0-a519-a9b30151eaa3-operator-scripts\") pod \"nova-cell1-7f2a-account-create-gfrxd\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.049732 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e055aa-e90a-4be0-a519-a9b30151eaa3-operator-scripts\") pod \"nova-cell1-7f2a-account-create-gfrxd\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.067977 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt5vs\" (UniqueName: 
\"kubernetes.io/projected/a5e055aa-e90a-4be0-a519-a9b30151eaa3-kube-api-access-rt5vs\") pod \"nova-cell1-7f2a-account-create-gfrxd\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.098058 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.135675 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.156518 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.178707 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.211943 4693 scope.go:117] "RemoveContainer" containerID="b4f5419c71b94019c4a7731f744eac9712e45c8cde2a446e4cffbfba4d3efbbb" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217028 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:28:28 crc kubenswrapper[4693]: E1125 12:28:28.217387 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-central-agent" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217398 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-central-agent" Nov 25 12:28:28 crc kubenswrapper[4693]: E1125 12:28:28.217424 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="proxy-httpd" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217430 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="proxy-httpd" Nov 25 12:28:28 crc kubenswrapper[4693]: E1125 12:28:28.217450 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-notification-agent" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217458 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-notification-agent" Nov 25 12:28:28 crc kubenswrapper[4693]: E1125 12:28:28.217473 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="sg-core" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217479 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="sg-core" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217655 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-central-agent" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217670 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="ceilometer-notification-agent" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.217688 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="proxy-httpd" Nov 25 12:28:28 crc kubenswrapper[4693]: 
I1125 12:28:28.217704 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" containerName="sg-core" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.218541 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.221814 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.222013 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.260946 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-log-httpd\") pod \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.260984 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-scripts\") pod \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.261027 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-combined-ca-bundle\") pod \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.261062 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-config-data\") pod \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.261097 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-run-httpd\") pod \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.261138 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-sg-core-conf-yaml\") pod \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.261234 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sktt4\" (UniqueName: \"kubernetes.io/projected/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-kube-api-access-sktt4\") pod \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\" (UID: \"60b0e117-e476-4ea8-b9e8-6cd21f6917a9\") " Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.262568 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60b0e117-e476-4ea8-b9e8-6cd21f6917a9" (UID: "60b0e117-e476-4ea8-b9e8-6cd21f6917a9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.263018 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60b0e117-e476-4ea8-b9e8-6cd21f6917a9" (UID: "60b0e117-e476-4ea8-b9e8-6cd21f6917a9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.263469 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.263481 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.271365 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.271900 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-scripts" (OuterVolumeSpecName: "scripts") pod "60b0e117-e476-4ea8-b9e8-6cd21f6917a9" (UID: "60b0e117-e476-4ea8-b9e8-6cd21f6917a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.280396 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-kube-api-access-sktt4" (OuterVolumeSpecName: "kube-api-access-sktt4") pod "60b0e117-e476-4ea8-b9e8-6cd21f6917a9" (UID: "60b0e117-e476-4ea8-b9e8-6cd21f6917a9"). InnerVolumeSpecName "kube-api-access-sktt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.296290 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t8c6z"] Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.303443 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "60b0e117-e476-4ea8-b9e8-6cd21f6917a9" (UID: "60b0e117-e476-4ea8-b9e8-6cd21f6917a9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:28 crc kubenswrapper[4693]: W1125 12:28:28.306576 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76f22823_54fe_41b3_8918_1f9920948635.slice/crio-42b620645643f20e602e73492348dc7d4c27db0d9a0bad04e30a8b5bac7873b7 WatchSource:0}: Error finding container 42b620645643f20e602e73492348dc7d4c27db0d9a0bad04e30a8b5bac7873b7: Status 404 returned error can't find the container with id 42b620645643f20e602e73492348dc7d4c27db0d9a0bad04e30a8b5bac7873b7 Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364660 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-logs\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364725 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364748 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364804 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364826 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364862 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364907 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nz2v\" (UniqueName: \"kubernetes.io/projected/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-kube-api-access-5nz2v\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364926 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.364991 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.365001 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.365011 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sktt4\" (UniqueName: \"kubernetes.io/projected/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-kube-api-access-sktt4\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.422070 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b0e117-e476-4ea8-b9e8-6cd21f6917a9" (UID: "60b0e117-e476-4ea8-b9e8-6cd21f6917a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.427185 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e58d-account-create-wxbnx"] Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.453510 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-config-data" (OuterVolumeSpecName: "config-data") pod "60b0e117-e476-4ea8-b9e8-6cd21f6917a9" (UID: "60b0e117-e476-4ea8-b9e8-6cd21f6917a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.467876 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.467933 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468064 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468108 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468168 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468243 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nz2v\" (UniqueName: \"kubernetes.io/projected/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-kube-api-access-5nz2v\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468274 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468323 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-logs\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468399 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468410 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/60b0e117-e476-4ea8-b9e8-6cd21f6917a9-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.468788 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-logs\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.469155 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.470519 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.474998 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.475275 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.487140 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.488511 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f5pb7"] Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.488894 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.490518 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nz2v\" (UniqueName: \"kubernetes.io/projected/711404f8-4ff3-44b1-b4f5-dfdc70ac930f-kube-api-access-5nz2v\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0" Nov 25 12:28:28 crc kubenswrapper[4693]: W1125 12:28:28.511990 4693 manager.go:1169] Failed to process watch event {EventType:0 
Nov 25 12:28:28 crc kubenswrapper[4693]: W1125 12:28:28.511990 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d1b45a_575e_42f4_b82a_cd8771e650e1.slice/crio-0ad1e0363ef6a87d5318241ed8331c337f128e95f3a7ccce4bcc9100a635abe4 WatchSource:0}: Error finding container 0ad1e0363ef6a87d5318241ed8331c337f128e95f3a7ccce4bcc9100a635abe4: Status 404 returned error can't find the container with id 0ad1e0363ef6a87d5318241ed8331c337f128e95f3a7ccce4bcc9100a635abe4
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.638251 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"711404f8-4ff3-44b1-b4f5-dfdc70ac930f\") " pod="openstack/glance-default-internal-api-0"
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.713245 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5f13-account-create-s2vtk"]
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.768598 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-k5vzm"]
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.840729 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3386396-6766-42b0-a683-af6f6c2da021" path="/var/lib/kubelet/pods/a3386396-6766-42b0-a683-af6f6c2da021/volumes"
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.852710 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.877325 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7f2a-account-create-gfrxd"]
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.903809 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5f13-account-create-s2vtk" event={"ID":"93f09b87-30d2-48ef-8b6b-13c25206a68d","Type":"ContainerStarted","Data":"579c921824eff0507237ae110c6606cdfe7921176dda145c76fc5003fd249b32"}
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.913754 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f5pb7" event={"ID":"63d1b45a-575e-42f4-b82a-cd8771e650e1","Type":"ContainerStarted","Data":"bdac18db2389d15ad40ef97c53b99d8c43a79125d4358735adc97afe440a32dc"}
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.913839 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f5pb7" event={"ID":"63d1b45a-575e-42f4-b82a-cd8771e650e1","Type":"ContainerStarted","Data":"0ad1e0363ef6a87d5318241ed8331c337f128e95f3a7ccce4bcc9100a635abe4"}
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.916336 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e58d-account-create-wxbnx" event={"ID":"75cab84e-e6f2-4b22-b67b-092223f7bc87","Type":"ContainerStarted","Data":"f1ea9325b99eeacb6a7f039e3e6dc0dae978ebf7fd1d76b721a21451ad440e6f"}
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.916414 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e58d-account-create-wxbnx" event={"ID":"75cab84e-e6f2-4b22-b67b-092223f7bc87","Type":"ContainerStarted","Data":"e714df243746832dfa35de75918cb32365d5fa346295a76f291542ae972cf294"}
Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.931585 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-f5pb7" podStartSLOduration=1.9315660000000001 podStartE2EDuration="1.931566s" podCreationTimestamp="2025-11-25 12:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:28.930724426 +0000 UTC m=+1228.848809797" watchObservedRunningTime="2025-11-25 12:28:28.931566 +0000 UTC m=+1228.849651371"
podCreationTimestamp="2025-11-25 12:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:28.930724426 +0000 UTC m=+1228.848809797" watchObservedRunningTime="2025-11-25 12:28:28.931566 +0000 UTC m=+1228.849651371" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.940596 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t8c6z" event={"ID":"76f22823-54fe-41b3-8918-1f9920948635","Type":"ContainerStarted","Data":"391ad313f567f214ba07a3a03bba6df3dbfa996993de563ca58116a3db0968c9"} Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.940637 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t8c6z" event={"ID":"76f22823-54fe-41b3-8918-1f9920948635","Type":"ContainerStarted","Data":"42b620645643f20e602e73492348dc7d4c27db0d9a0bad04e30a8b5bac7873b7"} Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.947068 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k5vzm" event={"ID":"5e34f200-34da-479a-a41c-f831d9e94220","Type":"ContainerStarted","Data":"2ad2ff4d23450b7a87dafd9f361d7871ce2f85c329532c9d468efed37c025ce3"} Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.955989 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-e58d-account-create-wxbnx" podStartSLOduration=1.955966601 podStartE2EDuration="1.955966601s" podCreationTimestamp="2025-11-25 12:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:28.946357599 +0000 UTC m=+1228.864443000" watchObservedRunningTime="2025-11-25 12:28:28.955966601 +0000 UTC m=+1228.874051982" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.988871 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60b0e117-e476-4ea8-b9e8-6cd21f6917a9","Type":"ContainerDied","Data":"931243106c248910fa18441a9a5d6af283b0e04986e9b34908211b8c726690d1"} Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.988918 4693 scope.go:117] "RemoveContainer" containerID="29319aebb06a13984ae4f7e9e34dfc6eff5bed336694da01bbc1b3075c566964" Nov 25 12:28:28 crc kubenswrapper[4693]: I1125 12:28:28.989051 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.162485 4693 scope.go:117] "RemoveContainer" containerID="ac9f65daac0a61ea1a80f60098a6d76f93fd6845c2ae9022577207dde3663cf4" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.186447 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.203280 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.218852 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.221927 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.225176 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.225417 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.229720 4693 scope.go:117] "RemoveContainer" containerID="512b6bb91b98ceca522e017dbcd8c7c443da9a5f551bb0b9473195828175809a" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.230367 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.264463 4693 scope.go:117] "RemoveContainer" containerID="1d5ab95e24871d2d267a12c36d7d1cb3a7dbfc0d6706950d9e23aace166d98ea" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.393102 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p589s\" (UniqueName: \"kubernetes.io/projected/f53837f9-be48-4bb1-82ab-b77aefe87355-kube-api-access-p589s\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.393183 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-run-httpd\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.393212 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-log-httpd\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.393288 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.393343 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-scripts\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.393424 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-config-data\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.393643 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0" Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.496644 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p589s\" (UniqueName: \"kubernetes.io/projected/f53837f9-be48-4bb1-82ab-b77aefe87355-kube-api-access-p589s\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.496738 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-run-httpd\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.496764 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-log-httpd\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.496813 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.496869 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-scripts\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.496931 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-config-data\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.496954 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.501345 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-run-httpd\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.501684 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-log-httpd\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.504795 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.507184 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 25 12:28:29 crc kubenswrapper[4693]: W1125 12:28:29.513295 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod711404f8_4ff3_44b1_b4f5_dfdc70ac930f.slice/crio-961a9d353fab1cdf22b4349b896f290cab1c258201cbce910091d63fd14b441c WatchSource:0}: Error finding container 961a9d353fab1cdf22b4349b896f290cab1c258201cbce910091d63fd14b441c: Status 404 returned error can't find the container with id 961a9d353fab1cdf22b4349b896f290cab1c258201cbce910091d63fd14b441c
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.515147 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-scripts\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.533429 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-config-data\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.534952 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p589s\" (UniqueName: \"kubernetes.io/projected/f53837f9-be48-4bb1-82ab-b77aefe87355-kube-api-access-p589s\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.536943 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " pod="openstack/ceilometer-0"
Nov 25 12:28:29 crc kubenswrapper[4693]: I1125 12:28:29.554894 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.001797 4693 generic.go:334] "Generic (PLEG): container finished" podID="a5e055aa-e90a-4be0-a519-a9b30151eaa3" containerID="e0c68fe472628b7a794d21897e8e4c8227fdf442c53de6c71d4e21eff02a3c4f" exitCode=0 Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.001863 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7f2a-account-create-gfrxd" event={"ID":"a5e055aa-e90a-4be0-a519-a9b30151eaa3","Type":"ContainerDied","Data":"e0c68fe472628b7a794d21897e8e4c8227fdf442c53de6c71d4e21eff02a3c4f"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.002436 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7f2a-account-create-gfrxd" event={"ID":"a5e055aa-e90a-4be0-a519-a9b30151eaa3","Type":"ContainerStarted","Data":"a9a22929820a5fb25451eea591c989cbe7b0faf63d7afd8d3f5b069455945e4a"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.022509 4693 generic.go:334] "Generic (PLEG): container finished" podID="5e34f200-34da-479a-a41c-f831d9e94220" containerID="641d4049a9d40f0c4affb2418df3cae80ff72148b4f727aaf6f6ebef7779c0fd" exitCode=0 Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.022625 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k5vzm" event={"ID":"5e34f200-34da-479a-a41c-f831d9e94220","Type":"ContainerDied","Data":"641d4049a9d40f0c4affb2418df3cae80ff72148b4f727aaf6f6ebef7779c0fd"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.039175 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bb2f0f2d-5d66-485f-a389-e07c52f143f2","Type":"ContainerStarted","Data":"f74f5e09e3474b9a6df7cc5a6a2e82686db4fda5bc9bf23537cf97eff463f333"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.047075 4693 generic.go:334] "Generic (PLEG): container finished" podID="93f09b87-30d2-48ef-8b6b-13c25206a68d" containerID="6c754fe65bd363484ffe743c05170e5a95e072704c87acdb666925d53eee2b22" exitCode=0 Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.047139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5f13-account-create-s2vtk" event={"ID":"93f09b87-30d2-48ef-8b6b-13c25206a68d","Type":"ContainerDied","Data":"6c754fe65bd363484ffe743c05170e5a95e072704c87acdb666925d53eee2b22"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.048137 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"711404f8-4ff3-44b1-b4f5-dfdc70ac930f","Type":"ContainerStarted","Data":"961a9d353fab1cdf22b4349b896f290cab1c258201cbce910091d63fd14b441c"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.049137 4693 generic.go:334] "Generic (PLEG): container finished" podID="63d1b45a-575e-42f4-b82a-cd8771e650e1" containerID="bdac18db2389d15ad40ef97c53b99d8c43a79125d4358735adc97afe440a32dc" exitCode=0 Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.049170 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f5pb7" event={"ID":"63d1b45a-575e-42f4-b82a-cd8771e650e1","Type":"ContainerDied","Data":"bdac18db2389d15ad40ef97c53b99d8c43a79125d4358735adc97afe440a32dc"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.050147 4693 generic.go:334] "Generic (PLEG): container finished" podID="75cab84e-e6f2-4b22-b67b-092223f7bc87" containerID="f1ea9325b99eeacb6a7f039e3e6dc0dae978ebf7fd1d76b721a21451ad440e6f" exitCode=0 Nov 
25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.050187 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e58d-account-create-wxbnx" event={"ID":"75cab84e-e6f2-4b22-b67b-092223f7bc87","Type":"ContainerDied","Data":"f1ea9325b99eeacb6a7f039e3e6dc0dae978ebf7fd1d76b721a21451ad440e6f"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.052306 4693 generic.go:334] "Generic (PLEG): container finished" podID="76f22823-54fe-41b3-8918-1f9920948635" containerID="391ad313f567f214ba07a3a03bba6df3dbfa996993de563ca58116a3db0968c9" exitCode=0 Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.052513 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t8c6z" event={"ID":"76f22823-54fe-41b3-8918-1f9920948635","Type":"ContainerDied","Data":"391ad313f567f214ba07a3a03bba6df3dbfa996993de563ca58116a3db0968c9"} Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.071409 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.071365456 podStartE2EDuration="5.071365456s" podCreationTimestamp="2025-11-25 12:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:30.06234679 +0000 UTC m=+1229.980432171" watchObservedRunningTime="2025-11-25 12:28:30.071365456 +0000 UTC m=+1229.989450837" Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.092289 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.497423 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.636755 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f22823-54fe-41b3-8918-1f9920948635-operator-scripts\") pod \"76f22823-54fe-41b3-8918-1f9920948635\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.636838 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wrmv\" (UniqueName: \"kubernetes.io/projected/76f22823-54fe-41b3-8918-1f9920948635-kube-api-access-6wrmv\") pod \"76f22823-54fe-41b3-8918-1f9920948635\" (UID: \"76f22823-54fe-41b3-8918-1f9920948635\") " Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.638363 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76f22823-54fe-41b3-8918-1f9920948635-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76f22823-54fe-41b3-8918-1f9920948635" (UID: "76f22823-54fe-41b3-8918-1f9920948635"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.651656 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f22823-54fe-41b3-8918-1f9920948635-kube-api-access-6wrmv" (OuterVolumeSpecName: "kube-api-access-6wrmv") pod "76f22823-54fe-41b3-8918-1f9920948635" (UID: "76f22823-54fe-41b3-8918-1f9920948635"). InnerVolumeSpecName "kube-api-access-6wrmv". 
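The db-create and account-create pods above are one-shot jobs: each produces a "Generic (PLEG): container finished ... exitCode=0" record followed by a ContainerDied event for the same container ID. Pairing ContainerStarted with ContainerDied by container hash gives per-container runtimes from the klog timestamps; a sketch against this dump's record shape (an assumption about this capture, and only valid within a single journal day since klog times carry no date):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

// Captures the klog time-of-day, the PLEG event type, and the container ID
// from the "SyncLoop (PLEG): event for pod" records above.
var pleg = regexp.MustCompile(`I\d{4} (\d{2}:\d{2}:\d{2}\.\d+) .*"Type":"(ContainerStarted|ContainerDied)","Data":"([0-9a-f]{64})"`)

func main() {
	started := map[string]time.Time{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20)
	for sc.Scan() {
		m := pleg.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		ts, err := time.Parse("15:04:05.000000", m[1]) // klog uses microsecond precision
		if err != nil {
			continue
		}
		switch m[2] {
		case "ContainerStarted":
			started[m[3]] = ts
		case "ContainerDied":
			if t0, ok := started[m[3]]; ok {
				fmt.Printf("%s… ran %v\n", m[3][:12], ts.Sub(t0))
			}
		}
	}
}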
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.741247 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76f22823-54fe-41b3-8918-1f9920948635-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.741578 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wrmv\" (UniqueName: \"kubernetes.io/projected/76f22823-54fe-41b3-8918-1f9920948635-kube-api-access-6wrmv\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:30 crc kubenswrapper[4693]: I1125 12:28:30.865258 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b0e117-e476-4ea8-b9e8-6cd21f6917a9" path="/var/lib/kubelet/pods/60b0e117-e476-4ea8-b9e8-6cd21f6917a9/volumes" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.063455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t8c6z" event={"ID":"76f22823-54fe-41b3-8918-1f9920948635","Type":"ContainerDied","Data":"42b620645643f20e602e73492348dc7d4c27db0d9a0bad04e30a8b5bac7873b7"} Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.063509 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42b620645643f20e602e73492348dc7d4c27db0d9a0bad04e30a8b5bac7873b7" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.063507 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t8c6z" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.065907 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"711404f8-4ff3-44b1-b4f5-dfdc70ac930f","Type":"ContainerStarted","Data":"c8d950a52a997500051d5d0735c00f30f9e397c2c1014066ec32fe009ad0d544"} Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.065952 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"711404f8-4ff3-44b1-b4f5-dfdc70ac930f","Type":"ContainerStarted","Data":"d01b522858aed2e5e48252a1d0f1b8724cd747e50ac4cab4ce3b2fb6988f4320"} Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.067962 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerStarted","Data":"a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a"} Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.068005 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerStarted","Data":"bd560a19f4c2ccd6569f205c990be862c2231edd48057a803474c09e48ae3f62"} Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.092899 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.092879779 podStartE2EDuration="3.092879779s" podCreationTimestamp="2025-11-25 12:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:28:31.082934648 +0000 UTC m=+1231.001020029" watchObservedRunningTime="2025-11-25 12:28:31.092879779 +0000 UTC m=+1231.010965160" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.473453 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.560437 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt5vs\" (UniqueName: \"kubernetes.io/projected/a5e055aa-e90a-4be0-a519-a9b30151eaa3-kube-api-access-rt5vs\") pod \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.560543 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e055aa-e90a-4be0-a519-a9b30151eaa3-operator-scripts\") pod \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\" (UID: \"a5e055aa-e90a-4be0-a519-a9b30151eaa3\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.562168 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e055aa-e90a-4be0-a519-a9b30151eaa3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5e055aa-e90a-4be0-a519-a9b30151eaa3" (UID: "a5e055aa-e90a-4be0-a519-a9b30151eaa3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.562413 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e055aa-e90a-4be0-a519-a9b30151eaa3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.584264 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e055aa-e90a-4be0-a519-a9b30151eaa3-kube-api-access-rt5vs" (OuterVolumeSpecName: "kube-api-access-rt5vs") pod "a5e055aa-e90a-4be0-a519-a9b30151eaa3" (UID: "a5e055aa-e90a-4be0-a519-a9b30151eaa3"). InnerVolumeSpecName "kube-api-access-rt5vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.663941 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt5vs\" (UniqueName: \"kubernetes.io/projected/a5e055aa-e90a-4be0-a519-a9b30151eaa3-kube-api-access-rt5vs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.736341 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.749516 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.781797 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.797341 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.873935 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxdw8\" (UniqueName: \"kubernetes.io/projected/75cab84e-e6f2-4b22-b67b-092223f7bc87-kube-api-access-vxdw8\") pod \"75cab84e-e6f2-4b22-b67b-092223f7bc87\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874033 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e34f200-34da-479a-a41c-f831d9e94220-operator-scripts\") pod \"5e34f200-34da-479a-a41c-f831d9e94220\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874084 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d1b45a-575e-42f4-b82a-cd8771e650e1-operator-scripts\") pod \"63d1b45a-575e-42f4-b82a-cd8771e650e1\" (UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874197 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f09b87-30d2-48ef-8b6b-13c25206a68d-operator-scripts\") pod \"93f09b87-30d2-48ef-8b6b-13c25206a68d\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874246 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nck4f\" (UniqueName: \"kubernetes.io/projected/63d1b45a-575e-42f4-b82a-cd8771e650e1-kube-api-access-nck4f\") pod \"63d1b45a-575e-42f4-b82a-cd8771e650e1\" (UID: \"63d1b45a-575e-42f4-b82a-cd8771e650e1\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874284 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75cab84e-e6f2-4b22-b67b-092223f7bc87-operator-scripts\") pod \"75cab84e-e6f2-4b22-b67b-092223f7bc87\" (UID: \"75cab84e-e6f2-4b22-b67b-092223f7bc87\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874322 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6vgq\" (UniqueName: \"kubernetes.io/projected/93f09b87-30d2-48ef-8b6b-13c25206a68d-kube-api-access-p6vgq\") pod \"93f09b87-30d2-48ef-8b6b-13c25206a68d\" (UID: \"93f09b87-30d2-48ef-8b6b-13c25206a68d\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874423 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79cs9\" (UniqueName: \"kubernetes.io/projected/5e34f200-34da-479a-a41c-f831d9e94220-kube-api-access-79cs9\") pod \"5e34f200-34da-479a-a41c-f831d9e94220\" (UID: \"5e34f200-34da-479a-a41c-f831d9e94220\") " Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874661 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e34f200-34da-479a-a41c-f831d9e94220-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e34f200-34da-479a-a41c-f831d9e94220" (UID: "5e34f200-34da-479a-a41c-f831d9e94220"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.874908 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e34f200-34da-479a-a41c-f831d9e94220-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.875933 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75cab84e-e6f2-4b22-b67b-092223f7bc87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75cab84e-e6f2-4b22-b67b-092223f7bc87" (UID: "75cab84e-e6f2-4b22-b67b-092223f7bc87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.876711 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d1b45a-575e-42f4-b82a-cd8771e650e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63d1b45a-575e-42f4-b82a-cd8771e650e1" (UID: "63d1b45a-575e-42f4-b82a-cd8771e650e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.876766 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f09b87-30d2-48ef-8b6b-13c25206a68d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93f09b87-30d2-48ef-8b6b-13c25206a68d" (UID: "93f09b87-30d2-48ef-8b6b-13c25206a68d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.880424 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e34f200-34da-479a-a41c-f831d9e94220-kube-api-access-79cs9" (OuterVolumeSpecName: "kube-api-access-79cs9") pod "5e34f200-34da-479a-a41c-f831d9e94220" (UID: "5e34f200-34da-479a-a41c-f831d9e94220"). InnerVolumeSpecName "kube-api-access-79cs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.891633 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d1b45a-575e-42f4-b82a-cd8771e650e1-kube-api-access-nck4f" (OuterVolumeSpecName: "kube-api-access-nck4f") pod "63d1b45a-575e-42f4-b82a-cd8771e650e1" (UID: "63d1b45a-575e-42f4-b82a-cd8771e650e1"). InnerVolumeSpecName "kube-api-access-nck4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.891710 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75cab84e-e6f2-4b22-b67b-092223f7bc87-kube-api-access-vxdw8" (OuterVolumeSpecName: "kube-api-access-vxdw8") pod "75cab84e-e6f2-4b22-b67b-092223f7bc87" (UID: "75cab84e-e6f2-4b22-b67b-092223f7bc87"). InnerVolumeSpecName "kube-api-access-vxdw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.898666 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f09b87-30d2-48ef-8b6b-13c25206a68d-kube-api-access-p6vgq" (OuterVolumeSpecName: "kube-api-access-p6vgq") pod "93f09b87-30d2-48ef-8b6b-13c25206a68d" (UID: "93f09b87-30d2-48ef-8b6b-13c25206a68d"). InnerVolumeSpecName "kube-api-access-p6vgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.976753 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79cs9\" (UniqueName: \"kubernetes.io/projected/5e34f200-34da-479a-a41c-f831d9e94220-kube-api-access-79cs9\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.976813 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxdw8\" (UniqueName: \"kubernetes.io/projected/75cab84e-e6f2-4b22-b67b-092223f7bc87-kube-api-access-vxdw8\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.976836 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63d1b45a-575e-42f4-b82a-cd8771e650e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.976875 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f09b87-30d2-48ef-8b6b-13c25206a68d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.976886 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nck4f\" (UniqueName: \"kubernetes.io/projected/63d1b45a-575e-42f4-b82a-cd8771e650e1-kube-api-access-nck4f\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.976896 4693 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75cab84e-e6f2-4b22-b67b-092223f7bc87-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:31 crc kubenswrapper[4693]: I1125 12:28:31.976908 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6vgq\" (UniqueName: \"kubernetes.io/projected/93f09b87-30d2-48ef-8b6b-13c25206a68d-kube-api-access-p6vgq\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.079044 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5f13-account-create-s2vtk" event={"ID":"93f09b87-30d2-48ef-8b6b-13c25206a68d","Type":"ContainerDied","Data":"579c921824eff0507237ae110c6606cdfe7921176dda145c76fc5003fd249b32"} Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.079417 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="579c921824eff0507237ae110c6606cdfe7921176dda145c76fc5003fd249b32" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.079055 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5f13-account-create-s2vtk" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.081060 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f5pb7" event={"ID":"63d1b45a-575e-42f4-b82a-cd8771e650e1","Type":"ContainerDied","Data":"0ad1e0363ef6a87d5318241ed8331c337f128e95f3a7ccce4bcc9100a635abe4"} Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.081087 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad1e0363ef6a87d5318241ed8331c337f128e95f3a7ccce4bcc9100a635abe4" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.081134 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f5pb7" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.086792 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e58d-account-create-wxbnx" event={"ID":"75cab84e-e6f2-4b22-b67b-092223f7bc87","Type":"ContainerDied","Data":"e714df243746832dfa35de75918cb32365d5fa346295a76f291542ae972cf294"} Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.086845 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e714df243746832dfa35de75918cb32365d5fa346295a76f291542ae972cf294" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.086814 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e58d-account-create-wxbnx" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.088788 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7f2a-account-create-gfrxd" event={"ID":"a5e055aa-e90a-4be0-a519-a9b30151eaa3","Type":"ContainerDied","Data":"a9a22929820a5fb25451eea591c989cbe7b0faf63d7afd8d3f5b069455945e4a"} Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.088820 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7f2a-account-create-gfrxd" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.088823 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a22929820a5fb25451eea591c989cbe7b0faf63d7afd8d3f5b069455945e4a" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.091051 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-k5vzm" event={"ID":"5e34f200-34da-479a-a41c-f831d9e94220","Type":"ContainerDied","Data":"2ad2ff4d23450b7a87dafd9f361d7871ce2f85c329532c9d468efed37c025ce3"} Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.091078 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad2ff4d23450b7a87dafd9f361d7871ce2f85c329532c9d468efed37c025ce3" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.091103 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-k5vzm" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.100982 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerStarted","Data":"bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af"} Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.565589 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bbcbd4584-78jln" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 25 12:28:32 crc kubenswrapper[4693]: I1125 12:28:32.565725 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:28:33 crc kubenswrapper[4693]: I1125 12:28:33.113026 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerStarted","Data":"6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0"} Nov 25 12:28:34 crc kubenswrapper[4693]: I1125 12:28:34.125657 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerStarted","Data":"92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662"} Nov 25 12:28:34 crc kubenswrapper[4693]: I1125 12:28:34.125989 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 12:28:34 crc kubenswrapper[4693]: I1125 12:28:34.152106 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.605325324 podStartE2EDuration="5.152087511s" podCreationTimestamp="2025-11-25 12:28:29 +0000 UTC" firstStartedPulling="2025-11-25 12:28:30.102420785 +0000 UTC m=+1230.020506166" lastFinishedPulling="2025-11-25 12:28:33.649182952 +0000 UTC m=+1233.567268353" observedRunningTime="2025-11-25 12:28:34.144813425 +0000 UTC m=+1234.062898816" watchObservedRunningTime="2025-11-25 12:28:34.152087511 +0000 UTC m=+1234.070172892" Nov 25 12:28:35 crc kubenswrapper[4693]: I1125 12:28:35.114137 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:28:35 crc kubenswrapper[4693]: I1125 12:28:35.114200 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:28:35 crc kubenswrapper[4693]: I1125 12:28:35.114271 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:28:35 crc kubenswrapper[4693]: I1125 12:28:35.115062 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"245f737f203c8007cd386e41d5f986e5bdb4a5f145f31a6ec9ef66e36fb73a9f"} 
pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:28:35 crc kubenswrapper[4693]: I1125 12:28:35.115126 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://245f737f203c8007cd386e41d5f986e5bdb4a5f145f31a6ec9ef66e36fb73a9f" gracePeriod=600 Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.145965 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="245f737f203c8007cd386e41d5f986e5bdb4a5f145f31a6ec9ef66e36fb73a9f" exitCode=0 Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.146027 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"245f737f203c8007cd386e41d5f986e5bdb4a5f145f31a6ec9ef66e36fb73a9f"} Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.146566 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"ce4776c622bc7e46d7d568ae624b5c3426e9dfd4bd443fa89113683ec10d405f"} Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.146591 4693 scope.go:117] "RemoveContainer" containerID="f1602027df59cd76a649d636d394ab648e039f6efe47c91bfe119cadecb3b352" Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.225012 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.225071 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.253493 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.266983 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.878954 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.880471 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-central-agent" containerID="cri-o://a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a" gracePeriod=30 Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.880538 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="sg-core" containerID="cri-o://6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0" gracePeriod=30 Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.880563 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="proxy-httpd" 
containerID="cri-o://92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662" gracePeriod=30 Nov 25 12:28:36 crc kubenswrapper[4693]: I1125 12:28:36.880747 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-notification-agent" containerID="cri-o://bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af" gracePeriod=30 Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.157735 4693 generic.go:334] "Generic (PLEG): container finished" podID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerID="92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662" exitCode=0 Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.157774 4693 generic.go:334] "Generic (PLEG): container finished" podID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerID="6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0" exitCode=2 Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.157833 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerDied","Data":"92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662"} Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.157880 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerDied","Data":"6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0"} Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.161203 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.161437 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.781562 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mwsdc"] Nov 25 12:28:37 crc kubenswrapper[4693]: E1125 12:28:37.782458 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e055aa-e90a-4be0-a519-a9b30151eaa3" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782485 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e055aa-e90a-4be0-a519-a9b30151eaa3" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: E1125 12:28:37.782508 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e34f200-34da-479a-a41c-f831d9e94220" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782516 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e34f200-34da-479a-a41c-f831d9e94220" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: E1125 12:28:37.782528 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f22823-54fe-41b3-8918-1f9920948635" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782536 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f22823-54fe-41b3-8918-1f9920948635" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: E1125 12:28:37.782547 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d1b45a-575e-42f4-b82a-cd8771e650e1" containerName="mariadb-database-create" Nov 25 
12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782555 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d1b45a-575e-42f4-b82a-cd8771e650e1" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: E1125 12:28:37.782583 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75cab84e-e6f2-4b22-b67b-092223f7bc87" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782591 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="75cab84e-e6f2-4b22-b67b-092223f7bc87" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: E1125 12:28:37.782606 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f09b87-30d2-48ef-8b6b-13c25206a68d" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782616 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f09b87-30d2-48ef-8b6b-13c25206a68d" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782841 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f22823-54fe-41b3-8918-1f9920948635" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782855 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="75cab84e-e6f2-4b22-b67b-092223f7bc87" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782880 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f09b87-30d2-48ef-8b6b-13c25206a68d" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782898 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e055aa-e90a-4be0-a519-a9b30151eaa3" containerName="mariadb-account-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782933 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d1b45a-575e-42f4-b82a-cd8771e650e1" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.782949 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e34f200-34da-479a-a41c-f831d9e94220" containerName="mariadb-database-create" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.783729 4693 util.go:30] "No sandbox for pod can be found. 
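The cpu_manager/memory_manager RemoveStaleState pairs above are the kubelet dropping checkpointed resource assignments for the completed mariadb job pods as the new db-sync pod is admitted. The CPU manager keeps that checkpoint in /var/lib/kubelet/cpu_manager_state; a sketch that reads it, where the JSON field names follow the commonly documented static-policy checkpoint and should be treated as an assumption rather than a contract:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Assumed checkpoint shape: with the static policy, "entries" maps
// podUID -> containerName -> assigned cpuset; with policy "none" it is empty.
type cpuManagerState struct {
	PolicyName    string                       `json:"policyName"`
	DefaultCPUSet string                       `json:"defaultCpuSet"`
	Entries       map[string]map[string]string `json:"entries,omitempty"`
}

func main() {
	raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
	if err != nil {
		panic(err)
	}
	var st cpuManagerState
	if err := json.Unmarshal(raw, &st); err != nil {
		panic(err)
	}
	fmt.Printf("policy=%s default=%q checkpointed pods=%d\n",
		st.PolicyName, st.DefaultCPUSet, len(st.Entries))
}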
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.790170 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.790814 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.813021 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j4vmt" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.834069 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mwsdc"] Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.906722 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-config-data\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.907536 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnmpz\" (UniqueName: \"kubernetes.io/projected/49348558-30c7-450b-978b-0be5ea427c08-kube-api-access-bnmpz\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.907831 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:37 crc kubenswrapper[4693]: I1125 12:28:37.908019 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-scripts\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.010432 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-config-data\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.010866 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnmpz\" (UniqueName: \"kubernetes.io/projected/49348558-30c7-450b-978b-0be5ea427c08-kube-api-access-bnmpz\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.010971 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mwsdc\" 
(UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.011052 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-scripts\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.021251 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.026023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-config-data\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.030869 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-scripts\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.035996 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnmpz\" (UniqueName: \"kubernetes.io/projected/49348558-30c7-450b-978b-0be5ea427c08-kube-api-access-bnmpz\") pod \"nova-cell0-conductor-db-sync-mwsdc\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.112772 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.178519 4693 generic.go:334] "Generic (PLEG): container finished" podID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerID="bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af" exitCode=0 Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.178579 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerDied","Data":"bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af"} Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.675326 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.685056 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mwsdc"] Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.725292 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-tls-certs\") pod \"1f60abf7-3c23-4174-9150-50061c054cf5\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.725388 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-secret-key\") pod \"1f60abf7-3c23-4174-9150-50061c054cf5\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.725419 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-885nr\" (UniqueName: \"kubernetes.io/projected/1f60abf7-3c23-4174-9150-50061c054cf5-kube-api-access-885nr\") pod \"1f60abf7-3c23-4174-9150-50061c054cf5\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.725455 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f60abf7-3c23-4174-9150-50061c054cf5-logs\") pod \"1f60abf7-3c23-4174-9150-50061c054cf5\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.725590 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-config-data\") pod \"1f60abf7-3c23-4174-9150-50061c054cf5\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.725634 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-combined-ca-bundle\") pod \"1f60abf7-3c23-4174-9150-50061c054cf5\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.725658 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-scripts\") pod \"1f60abf7-3c23-4174-9150-50061c054cf5\" (UID: \"1f60abf7-3c23-4174-9150-50061c054cf5\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.726235 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f60abf7-3c23-4174-9150-50061c054cf5-logs" (OuterVolumeSpecName: "logs") pod "1f60abf7-3c23-4174-9150-50061c054cf5" (UID: "1f60abf7-3c23-4174-9150-50061c054cf5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.740001 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1f60abf7-3c23-4174-9150-50061c054cf5" (UID: "1f60abf7-3c23-4174-9150-50061c054cf5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.740005 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f60abf7-3c23-4174-9150-50061c054cf5-kube-api-access-885nr" (OuterVolumeSpecName: "kube-api-access-885nr") pod "1f60abf7-3c23-4174-9150-50061c054cf5" (UID: "1f60abf7-3c23-4174-9150-50061c054cf5"). InnerVolumeSpecName "kube-api-access-885nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.758583 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f60abf7-3c23-4174-9150-50061c054cf5" (UID: "1f60abf7-3c23-4174-9150-50061c054cf5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.766326 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-config-data" (OuterVolumeSpecName: "config-data") pod "1f60abf7-3c23-4174-9150-50061c054cf5" (UID: "1f60abf7-3c23-4174-9150-50061c054cf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.783957 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1f60abf7-3c23-4174-9150-50061c054cf5" (UID: "1f60abf7-3c23-4174-9150-50061c054cf5"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.794610 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-scripts" (OuterVolumeSpecName: "scripts") pod "1f60abf7-3c23-4174-9150-50061c054cf5" (UID: "1f60abf7-3c23-4174-9150-50061c054cf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.807212 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.827666 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f60abf7-3c23-4174-9150-50061c054cf5-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.827694 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.827704 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.827713 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f60abf7-3c23-4174-9150-50061c054cf5-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.827721 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.827729 4693 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1f60abf7-3c23-4174-9150-50061c054cf5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.827738 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-885nr\" (UniqueName: \"kubernetes.io/projected/1f60abf7-3c23-4174-9150-50061c054cf5-kube-api-access-885nr\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.854291 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.854343 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.896094 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.914034 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.929325 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-sg-core-conf-yaml\") pod \"f53837f9-be48-4bb1-82ab-b77aefe87355\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.929386 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-run-httpd\") pod \"f53837f9-be48-4bb1-82ab-b77aefe87355\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.929441 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-combined-ca-bundle\") pod \"f53837f9-be48-4bb1-82ab-b77aefe87355\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.929467 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-log-httpd\") pod \"f53837f9-be48-4bb1-82ab-b77aefe87355\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.929496 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p589s\" (UniqueName: \"kubernetes.io/projected/f53837f9-be48-4bb1-82ab-b77aefe87355-kube-api-access-p589s\") pod \"f53837f9-be48-4bb1-82ab-b77aefe87355\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.929548 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-scripts\") pod \"f53837f9-be48-4bb1-82ab-b77aefe87355\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.929587 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-config-data\") pod \"f53837f9-be48-4bb1-82ab-b77aefe87355\" (UID: \"f53837f9-be48-4bb1-82ab-b77aefe87355\") " Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.930067 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f53837f9-be48-4bb1-82ab-b77aefe87355" (UID: "f53837f9-be48-4bb1-82ab-b77aefe87355"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.930212 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.930394 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f53837f9-be48-4bb1-82ab-b77aefe87355" (UID: "f53837f9-be48-4bb1-82ab-b77aefe87355"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.935655 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53837f9-be48-4bb1-82ab-b77aefe87355-kube-api-access-p589s" (OuterVolumeSpecName: "kube-api-access-p589s") pod "f53837f9-be48-4bb1-82ab-b77aefe87355" (UID: "f53837f9-be48-4bb1-82ab-b77aefe87355"). InnerVolumeSpecName "kube-api-access-p589s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.935910 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-scripts" (OuterVolumeSpecName: "scripts") pod "f53837f9-be48-4bb1-82ab-b77aefe87355" (UID: "f53837f9-be48-4bb1-82ab-b77aefe87355"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:38 crc kubenswrapper[4693]: I1125 12:28:38.965014 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f53837f9-be48-4bb1-82ab-b77aefe87355" (UID: "f53837f9-be48-4bb1-82ab-b77aefe87355"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.015834 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f53837f9-be48-4bb1-82ab-b77aefe87355" (UID: "f53837f9-be48-4bb1-82ab-b77aefe87355"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.032335 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.032395 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f53837f9-be48-4bb1-82ab-b77aefe87355-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.032409 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.032422 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p589s\" (UniqueName: \"kubernetes.io/projected/f53837f9-be48-4bb1-82ab-b77aefe87355-kube-api-access-p589s\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.032435 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.054275 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-config-data" (OuterVolumeSpecName: "config-data") pod "f53837f9-be48-4bb1-82ab-b77aefe87355" (UID: "f53837f9-be48-4bb1-82ab-b77aefe87355"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.133777 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53837f9-be48-4bb1-82ab-b77aefe87355-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.191239 4693 generic.go:334] "Generic (PLEG): container finished" podID="1f60abf7-3c23-4174-9150-50061c054cf5" containerID="5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38" exitCode=137 Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.191301 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbcbd4584-78jln" event={"ID":"1f60abf7-3c23-4174-9150-50061c054cf5","Type":"ContainerDied","Data":"5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38"} Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.191400 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbcbd4584-78jln" event={"ID":"1f60abf7-3c23-4174-9150-50061c054cf5","Type":"ContainerDied","Data":"5fbd3f1b8bdb7e8b933ea91a128eb10c755273c96443365bb0cf920e3a71e90d"} Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.191419 4693 scope.go:117] "RemoveContainer" containerID="d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.191505 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbcbd4584-78jln" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.193614 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" event={"ID":"49348558-30c7-450b-978b-0be5ea427c08","Type":"ContainerStarted","Data":"2fd050966a2cb0d36393506ca22cfcf8707652c8e929ce8c8869ddb126cf863e"} Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.202051 4693 generic.go:334] "Generic (PLEG): container finished" podID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerID="a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a" exitCode=0 Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.202197 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.202243 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerDied","Data":"a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a"} Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.202278 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f53837f9-be48-4bb1-82ab-b77aefe87355","Type":"ContainerDied","Data":"bd560a19f4c2ccd6569f205c990be862c2231edd48057a803474c09e48ae3f62"} Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.202666 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.202816 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.219721 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bbcbd4584-78jln"] Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.228815 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bbcbd4584-78jln"] Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.258041 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.271610 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.289219 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.289715 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon-log" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.289740 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon-log" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.289758 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.289765 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.289782 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-central-agent" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.289791 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-central-agent" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.289808 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="sg-core" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.289816 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="sg-core" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.289836 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-notification-agent" Nov 25 
12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.289843 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-notification-agent" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.289860 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="proxy-httpd" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.289867 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="proxy-httpd" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.290077 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="proxy-httpd" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.290101 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon-log" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.290117 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" containerName="horizon" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.290132 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-notification-agent" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.290148 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="ceilometer-central-agent" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.290157 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" containerName="sg-core" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.292219 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.296251 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.296619 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.307734 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.346345 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.346460 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-run-httpd\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.346534 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.346588 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsjkh\" (UniqueName: \"kubernetes.io/projected/01b8bcbf-77a4-466c-8bf1-c320a4688751-kube-api-access-bsjkh\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.346609 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-scripts\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.346635 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-log-httpd\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.346662 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-config-data\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.409610 4693 scope.go:117] "RemoveContainer" containerID="5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.430663 4693 scope.go:117] "RemoveContainer" containerID="d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 
12:28:39.431222 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f\": container with ID starting with d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f not found: ID does not exist" containerID="d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.431264 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f"} err="failed to get container status \"d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f\": rpc error: code = NotFound desc = could not find container \"d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f\": container with ID starting with d1bec51af78d06c8fa245aa09d8f2dd8999318e935b63804806a462831ef142f not found: ID does not exist" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.431293 4693 scope.go:117] "RemoveContainer" containerID="5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.431610 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38\": container with ID starting with 5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38 not found: ID does not exist" containerID="5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.431640 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38"} err="failed to get container status \"5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38\": rpc error: code = NotFound desc = could not find container \"5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38\": container with ID starting with 5966af34bc21d139601ee6e07c6aa929a3b2e237e8fc9bfe288c111940583e38 not found: ID does not exist" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.431657 4693 scope.go:117] "RemoveContainer" containerID="92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.449318 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsjkh\" (UniqueName: \"kubernetes.io/projected/01b8bcbf-77a4-466c-8bf1-c320a4688751-kube-api-access-bsjkh\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.449381 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-scripts\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.449433 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-log-httpd\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.449473 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-config-data\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.449667 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.449696 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-run-httpd\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.449833 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.451333 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-log-httpd\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.452728 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-run-httpd\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.465606 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-config-data\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.467184 4693 scope.go:117] "RemoveContainer" containerID="6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.467464 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-scripts\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.470085 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.474178 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsjkh\" (UniqueName: \"kubernetes.io/projected/01b8bcbf-77a4-466c-8bf1-c320a4688751-kube-api-access-bsjkh\") pod \"ceilometer-0\" (UID: 
\"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.477864 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.604541 4693 scope.go:117] "RemoveContainer" containerID="bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.617192 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.632139 4693 scope.go:117] "RemoveContainer" containerID="a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.685407 4693 scope.go:117] "RemoveContainer" containerID="92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.685853 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662\": container with ID starting with 92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662 not found: ID does not exist" containerID="92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.685894 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662"} err="failed to get container status \"92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662\": rpc error: code = NotFound desc = could not find container \"92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662\": container with ID starting with 92daf228e4eac344e8ba88fb1096a8c6a8507bf0718b190aa3ee12816ec43662 not found: ID does not exist" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.685935 4693 scope.go:117] "RemoveContainer" containerID="6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.686179 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0\": container with ID starting with 6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0 not found: ID does not exist" containerID="6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.686205 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0"} err="failed to get container status \"6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0\": rpc error: code = NotFound desc = could not find container \"6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0\": container with ID starting with 6d990b59d164a3fa599cc32455eb28559bc8239ddf2f16c224012a4b2b6a4bc0 not found: ID does not exist" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.686219 4693 scope.go:117] "RemoveContainer" 
containerID="bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.686540 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af\": container with ID starting with bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af not found: ID does not exist" containerID="bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.686561 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af"} err="failed to get container status \"bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af\": rpc error: code = NotFound desc = could not find container \"bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af\": container with ID starting with bcb78c29f54584a0892b2a48eedc9ac04a09fbb8cb1f1e310fa8f9e7288c45af not found: ID does not exist" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.686573 4693 scope.go:117] "RemoveContainer" containerID="a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a" Nov 25 12:28:39 crc kubenswrapper[4693]: E1125 12:28:39.687559 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a\": container with ID starting with a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a not found: ID does not exist" containerID="a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.687583 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a"} err="failed to get container status \"a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a\": rpc error: code = NotFound desc = could not find container \"a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a\": container with ID starting with a6651182af4388122951b1a840c3af8bbbec45f0e24695049bbb90353a1a7c4a not found: ID does not exist" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.855946 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.856362 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:28:39 crc kubenswrapper[4693]: I1125 12:28:39.941468 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 25 12:28:40 crc kubenswrapper[4693]: I1125 12:28:40.202816 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:40 crc kubenswrapper[4693]: W1125 12:28:40.214206 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01b8bcbf_77a4_466c_8bf1_c320a4688751.slice/crio-1be8e7109e2025eec165e3c9113d9001dad7978a131eadfa8ef21f4a26118405 WatchSource:0}: Error finding container 1be8e7109e2025eec165e3c9113d9001dad7978a131eadfa8ef21f4a26118405: Status 404 returned error can't find the container with id 
1be8e7109e2025eec165e3c9113d9001dad7978a131eadfa8ef21f4a26118405 Nov 25 12:28:40 crc kubenswrapper[4693]: I1125 12:28:40.834038 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f60abf7-3c23-4174-9150-50061c054cf5" path="/var/lib/kubelet/pods/1f60abf7-3c23-4174-9150-50061c054cf5/volumes" Nov 25 12:28:40 crc kubenswrapper[4693]: I1125 12:28:40.834925 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53837f9-be48-4bb1-82ab-b77aefe87355" path="/var/lib/kubelet/pods/f53837f9-be48-4bb1-82ab-b77aefe87355/volumes" Nov 25 12:28:41 crc kubenswrapper[4693]: I1125 12:28:41.230245 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerStarted","Data":"56f77ea6a467fc6a485602ac51ff24e78354d38fa68b6ccbbf965e0722649cdf"} Nov 25 12:28:41 crc kubenswrapper[4693]: I1125 12:28:41.231152 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerStarted","Data":"1be8e7109e2025eec165e3c9113d9001dad7978a131eadfa8ef21f4a26118405"} Nov 25 12:28:41 crc kubenswrapper[4693]: I1125 12:28:41.544232 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:41 crc kubenswrapper[4693]: I1125 12:28:41.544317 4693 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 25 12:28:41 crc kubenswrapper[4693]: I1125 12:28:41.674256 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 25 12:28:42 crc kubenswrapper[4693]: I1125 12:28:42.251826 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerStarted","Data":"585d9913cf8c6ced88b4e7219c95f02c87e0d6483f1f486a21591c87558289ed"} Nov 25 12:28:43 crc kubenswrapper[4693]: I1125 12:28:43.265448 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerStarted","Data":"d6d3ba9a306b5bbc79bc207e344574203fa48faadb2a04c619ce99ebd74a8f23"} Nov 25 12:28:44 crc kubenswrapper[4693]: I1125 12:28:44.626295 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:50 crc kubenswrapper[4693]: I1125 12:28:50.329464 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" event={"ID":"49348558-30c7-450b-978b-0be5ea427c08","Type":"ContainerStarted","Data":"87bc75025aae6c2df3afa013ecdd94fc884d4fca980e665e3058c7afd5fac2f8"} Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.348152 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-central-agent" containerID="cri-o://56f77ea6a467fc6a485602ac51ff24e78354d38fa68b6ccbbf965e0722649cdf" gracePeriod=30 Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.348806 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerStarted","Data":"1d445022f99729d7f3d390e58ee942f5680caa0e8f8336440e5d6344c6513c52"} Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.348875 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" 
Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.348927 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="proxy-httpd" containerID="cri-o://1d445022f99729d7f3d390e58ee942f5680caa0e8f8336440e5d6344c6513c52" gracePeriod=30 Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.349025 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="sg-core" containerID="cri-o://d6d3ba9a306b5bbc79bc207e344574203fa48faadb2a04c619ce99ebd74a8f23" gracePeriod=30 Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.349103 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-notification-agent" containerID="cri-o://585d9913cf8c6ced88b4e7219c95f02c87e0d6483f1f486a21591c87558289ed" gracePeriod=30 Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.393899 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" podStartSLOduration=3.238172669 podStartE2EDuration="14.393879832s" podCreationTimestamp="2025-11-25 12:28:37 +0000 UTC" firstStartedPulling="2025-11-25 12:28:38.695549439 +0000 UTC m=+1238.613634820" lastFinishedPulling="2025-11-25 12:28:49.851256602 +0000 UTC m=+1249.769341983" observedRunningTime="2025-11-25 12:28:51.368091811 +0000 UTC m=+1251.286177212" watchObservedRunningTime="2025-11-25 12:28:51.393879832 +0000 UTC m=+1251.311965223" Nov 25 12:28:51 crc kubenswrapper[4693]: I1125 12:28:51.397130 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.764386064 podStartE2EDuration="12.397111264s" podCreationTimestamp="2025-11-25 12:28:39 +0000 UTC" firstStartedPulling="2025-11-25 12:28:40.218907563 +0000 UTC m=+1240.136992944" lastFinishedPulling="2025-11-25 12:28:49.851632763 +0000 UTC m=+1249.769718144" observedRunningTime="2025-11-25 12:28:51.387507321 +0000 UTC m=+1251.305592732" watchObservedRunningTime="2025-11-25 12:28:51.397111264 +0000 UTC m=+1251.315196665" Nov 25 12:28:52 crc kubenswrapper[4693]: I1125 12:28:52.364840 4693 generic.go:334] "Generic (PLEG): container finished" podID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerID="1d445022f99729d7f3d390e58ee942f5680caa0e8f8336440e5d6344c6513c52" exitCode=0 Nov 25 12:28:52 crc kubenswrapper[4693]: I1125 12:28:52.365198 4693 generic.go:334] "Generic (PLEG): container finished" podID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerID="d6d3ba9a306b5bbc79bc207e344574203fa48faadb2a04c619ce99ebd74a8f23" exitCode=2 Nov 25 12:28:52 crc kubenswrapper[4693]: I1125 12:28:52.364952 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerDied","Data":"1d445022f99729d7f3d390e58ee942f5680caa0e8f8336440e5d6344c6513c52"} Nov 25 12:28:52 crc kubenswrapper[4693]: I1125 12:28:52.365246 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerDied","Data":"d6d3ba9a306b5bbc79bc207e344574203fa48faadb2a04c619ce99ebd74a8f23"} Nov 25 12:28:52 crc kubenswrapper[4693]: E1125 12:28:52.633667 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01b8bcbf_77a4_466c_8bf1_c320a4688751.slice/crio-conmon-585d9913cf8c6ced88b4e7219c95f02c87e0d6483f1f486a21591c87558289ed.scope\": RecentStats: unable to find data in memory cache]" Nov 25 12:28:53 crc kubenswrapper[4693]: I1125 12:28:53.376620 4693 generic.go:334] "Generic (PLEG): container finished" podID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerID="585d9913cf8c6ced88b4e7219c95f02c87e0d6483f1f486a21591c87558289ed" exitCode=0 Nov 25 12:28:53 crc kubenswrapper[4693]: I1125 12:28:53.376938 4693 generic.go:334] "Generic (PLEG): container finished" podID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerID="56f77ea6a467fc6a485602ac51ff24e78354d38fa68b6ccbbf965e0722649cdf" exitCode=0 Nov 25 12:28:53 crc kubenswrapper[4693]: I1125 12:28:53.376693 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerDied","Data":"585d9913cf8c6ced88b4e7219c95f02c87e0d6483f1f486a21591c87558289ed"} Nov 25 12:28:53 crc kubenswrapper[4693]: I1125 12:28:53.376968 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerDied","Data":"56f77ea6a467fc6a485602ac51ff24e78354d38fa68b6ccbbf965e0722649cdf"} Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.510066 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.644706 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-run-httpd\") pod \"01b8bcbf-77a4-466c-8bf1-c320a4688751\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.644804 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-scripts\") pod \"01b8bcbf-77a4-466c-8bf1-c320a4688751\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.644903 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-sg-core-conf-yaml\") pod \"01b8bcbf-77a4-466c-8bf1-c320a4688751\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.644927 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-log-httpd\") pod \"01b8bcbf-77a4-466c-8bf1-c320a4688751\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.645046 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsjkh\" (UniqueName: \"kubernetes.io/projected/01b8bcbf-77a4-466c-8bf1-c320a4688751-kube-api-access-bsjkh\") pod \"01b8bcbf-77a4-466c-8bf1-c320a4688751\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.645137 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-config-data\") pod 
\"01b8bcbf-77a4-466c-8bf1-c320a4688751\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.645219 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01b8bcbf-77a4-466c-8bf1-c320a4688751" (UID: "01b8bcbf-77a4-466c-8bf1-c320a4688751"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.645232 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-combined-ca-bundle\") pod \"01b8bcbf-77a4-466c-8bf1-c320a4688751\" (UID: \"01b8bcbf-77a4-466c-8bf1-c320a4688751\") " Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.645450 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01b8bcbf-77a4-466c-8bf1-c320a4688751" (UID: "01b8bcbf-77a4-466c-8bf1-c320a4688751"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.645824 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.645851 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01b8bcbf-77a4-466c-8bf1-c320a4688751-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.651542 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-scripts" (OuterVolumeSpecName: "scripts") pod "01b8bcbf-77a4-466c-8bf1-c320a4688751" (UID: "01b8bcbf-77a4-466c-8bf1-c320a4688751"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.653467 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b8bcbf-77a4-466c-8bf1-c320a4688751-kube-api-access-bsjkh" (OuterVolumeSpecName: "kube-api-access-bsjkh") pod "01b8bcbf-77a4-466c-8bf1-c320a4688751" (UID: "01b8bcbf-77a4-466c-8bf1-c320a4688751"). InnerVolumeSpecName "kube-api-access-bsjkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.672025 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01b8bcbf-77a4-466c-8bf1-c320a4688751" (UID: "01b8bcbf-77a4-466c-8bf1-c320a4688751"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.716108 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01b8bcbf-77a4-466c-8bf1-c320a4688751" (UID: "01b8bcbf-77a4-466c-8bf1-c320a4688751"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.748043 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsjkh\" (UniqueName: \"kubernetes.io/projected/01b8bcbf-77a4-466c-8bf1-c320a4688751-kube-api-access-bsjkh\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.748082 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.748096 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.748111 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.751902 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-config-data" (OuterVolumeSpecName: "config-data") pod "01b8bcbf-77a4-466c-8bf1-c320a4688751" (UID: "01b8bcbf-77a4-466c-8bf1-c320a4688751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:28:54 crc kubenswrapper[4693]: I1125 12:28:54.849638 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01b8bcbf-77a4-466c-8bf1-c320a4688751-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.395884 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01b8bcbf-77a4-466c-8bf1-c320a4688751","Type":"ContainerDied","Data":"1be8e7109e2025eec165e3c9113d9001dad7978a131eadfa8ef21f4a26118405"} Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.395936 4693 scope.go:117] "RemoveContainer" containerID="1d445022f99729d7f3d390e58ee942f5680caa0e8f8336440e5d6344c6513c52" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.396054 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.430559 4693 scope.go:117] "RemoveContainer" containerID="d6d3ba9a306b5bbc79bc207e344574203fa48faadb2a04c619ce99ebd74a8f23" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.430753 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.437981 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.458769 4693 scope.go:117] "RemoveContainer" containerID="585d9913cf8c6ced88b4e7219c95f02c87e0d6483f1f486a21591c87558289ed" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.464436 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:55 crc kubenswrapper[4693]: E1125 12:28:55.464884 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="proxy-httpd" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.464912 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="proxy-httpd" Nov 25 12:28:55 crc kubenswrapper[4693]: E1125 12:28:55.464942 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-notification-agent" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.464957 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-notification-agent" Nov 25 12:28:55 crc kubenswrapper[4693]: E1125 12:28:55.464984 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-central-agent" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.464996 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-central-agent" Nov 25 12:28:55 crc kubenswrapper[4693]: E1125 12:28:55.465025 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="sg-core" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.465037 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="sg-core" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.465334 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="sg-core" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.465360 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-notification-agent" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.465413 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="ceilometer-central-agent" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.465437 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" containerName="proxy-httpd" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.467691 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.470625 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.477046 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.489840 4693 scope.go:117] "RemoveContainer" containerID="56f77ea6a467fc6a485602ac51ff24e78354d38fa68b6ccbbf965e0722649cdf" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.495369 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.561401 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.561553 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-run-httpd\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.561614 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-scripts\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.561668 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5qs2\" (UniqueName: \"kubernetes.io/projected/100b892d-b171-4ed1-a355-fc4e59d989a0-kube-api-access-f5qs2\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.561712 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-log-httpd\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.561764 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.561841 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-config-data\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.663495 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-run-httpd\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.663573 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-scripts\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.663618 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5qs2\" (UniqueName: \"kubernetes.io/projected/100b892d-b171-4ed1-a355-fc4e59d989a0-kube-api-access-f5qs2\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.663660 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-log-httpd\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.663702 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.663731 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-config-data\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.663822 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.664065 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-run-httpd\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.664255 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-log-httpd\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.668043 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.668475 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-scripts\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.668493 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-config-data\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.671435 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.683259 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5qs2\" (UniqueName: \"kubernetes.io/projected/100b892d-b171-4ed1-a355-fc4e59d989a0-kube-api-access-f5qs2\") pod \"ceilometer-0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " pod="openstack/ceilometer-0" Nov 25 12:28:55 crc kubenswrapper[4693]: I1125 12:28:55.797636 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:28:56 crc kubenswrapper[4693]: I1125 12:28:56.272093 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:28:56 crc kubenswrapper[4693]: I1125 12:28:56.275532 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 12:28:56 crc kubenswrapper[4693]: I1125 12:28:56.408880 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerStarted","Data":"4b5a0e23ba86e17d7cb145dec0b016c117af3700094f548ffa241622dc6cd929"} Nov 25 12:28:56 crc kubenswrapper[4693]: I1125 12:28:56.823549 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b8bcbf-77a4-466c-8bf1-c320a4688751" path="/var/lib/kubelet/pods/01b8bcbf-77a4-466c-8bf1-c320a4688751/volumes" Nov 25 12:28:58 crc kubenswrapper[4693]: I1125 12:28:58.432852 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerStarted","Data":"e9ca68dbc65fb897e09f9f5d8f854603d26c4aa7d019cf824392742f77a0b365"} Nov 25 12:28:58 crc kubenswrapper[4693]: I1125 12:28:58.433694 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerStarted","Data":"87c41bb256d68f6bf3bf4d5a50aed7fda933bef5e360aa6a807460997d464589"} Nov 25 12:28:59 crc kubenswrapper[4693]: I1125 12:28:59.133046 4693 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod60b0e117-e476-4ea8-b9e8-6cd21f6917a9"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod60b0e117-e476-4ea8-b9e8-6cd21f6917a9] : Timed out while waiting for systemd to remove kubepods-besteffort-pod60b0e117_e476_4ea8_b9e8_6cd21f6917a9.slice" Nov 25 12:28:59 crc kubenswrapper[4693]: I1125 12:28:59.442321 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerStarted","Data":"ffa259d9c0f30021eddc950d93a0f9865116860b783849e88c75a3be29c8abf6"} Nov 25 12:29:00 crc kubenswrapper[4693]: I1125 12:29:00.454919 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerStarted","Data":"fa721a39c46572d7fa37e6fac24e52af88798f03bb33bcdcdb09c2ceaf5690a2"} Nov 25 12:29:00 crc kubenswrapper[4693]: I1125 12:29:00.455356 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 12:29:00 crc kubenswrapper[4693]: I1125 12:29:00.475764 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.535248892 podStartE2EDuration="5.475744714s" podCreationTimestamp="2025-11-25 12:28:55 +0000 UTC" firstStartedPulling="2025-11-25 12:28:56.275313886 +0000 UTC m=+1256.193399267" lastFinishedPulling="2025-11-25 12:29:00.215809708 +0000 UTC m=+1260.133895089" observedRunningTime="2025-11-25 12:29:00.47417694 +0000 UTC m=+1260.392262321" watchObservedRunningTime="2025-11-25 12:29:00.475744714 +0000 UTC m=+1260.393830095" Nov 25 12:29:02 crc kubenswrapper[4693]: I1125 12:29:02.473887 4693 generic.go:334] "Generic (PLEG): container finished" podID="49348558-30c7-450b-978b-0be5ea427c08" containerID="87bc75025aae6c2df3afa013ecdd94fc884d4fca980e665e3058c7afd5fac2f8" exitCode=0 Nov 25 12:29:02 crc kubenswrapper[4693]: I1125 12:29:02.474011 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" event={"ID":"49348558-30c7-450b-978b-0be5ea427c08","Type":"ContainerDied","Data":"87bc75025aae6c2df3afa013ecdd94fc884d4fca980e665e3058c7afd5fac2f8"} Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.808224 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.943211 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-scripts\") pod \"49348558-30c7-450b-978b-0be5ea427c08\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.943279 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnmpz\" (UniqueName: \"kubernetes.io/projected/49348558-30c7-450b-978b-0be5ea427c08-kube-api-access-bnmpz\") pod \"49348558-30c7-450b-978b-0be5ea427c08\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.943336 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-combined-ca-bundle\") pod \"49348558-30c7-450b-978b-0be5ea427c08\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.943418 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-config-data\") pod \"49348558-30c7-450b-978b-0be5ea427c08\" (UID: \"49348558-30c7-450b-978b-0be5ea427c08\") " Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.949483 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49348558-30c7-450b-978b-0be5ea427c08-kube-api-access-bnmpz" (OuterVolumeSpecName: "kube-api-access-bnmpz") pod "49348558-30c7-450b-978b-0be5ea427c08" (UID: "49348558-30c7-450b-978b-0be5ea427c08"). InnerVolumeSpecName "kube-api-access-bnmpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.962778 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-scripts" (OuterVolumeSpecName: "scripts") pod "49348558-30c7-450b-978b-0be5ea427c08" (UID: "49348558-30c7-450b-978b-0be5ea427c08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.976958 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49348558-30c7-450b-978b-0be5ea427c08" (UID: "49348558-30c7-450b-978b-0be5ea427c08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:03 crc kubenswrapper[4693]: I1125 12:29:03.995182 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-config-data" (OuterVolumeSpecName: "config-data") pod "49348558-30c7-450b-978b-0be5ea427c08" (UID: "49348558-30c7-450b-978b-0be5ea427c08"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.046626 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.046665 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnmpz\" (UniqueName: \"kubernetes.io/projected/49348558-30c7-450b-978b-0be5ea427c08-kube-api-access-bnmpz\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.046683 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.046695 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49348558-30c7-450b-978b-0be5ea427c08-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.498343 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" event={"ID":"49348558-30c7-450b-978b-0be5ea427c08","Type":"ContainerDied","Data":"2fd050966a2cb0d36393506ca22cfcf8707652c8e929ce8c8869ddb126cf863e"} Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.498468 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fd050966a2cb0d36393506ca22cfcf8707652c8e929ce8c8869ddb126cf863e" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.498381 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mwsdc" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.613143 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 12:29:04 crc kubenswrapper[4693]: E1125 12:29:04.613711 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49348558-30c7-450b-978b-0be5ea427c08" containerName="nova-cell0-conductor-db-sync" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.613731 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="49348558-30c7-450b-978b-0be5ea427c08" containerName="nova-cell0-conductor-db-sync" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.613912 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="49348558-30c7-450b-978b-0be5ea427c08" containerName="nova-cell0-conductor-db-sync" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.614507 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.616406 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.616600 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j4vmt" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.622451 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.764055 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc79fdf-d996-42ba-b250-2501738ed0bc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.764429 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc79fdf-d996-42ba-b250-2501738ed0bc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.764611 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hncns\" (UniqueName: \"kubernetes.io/projected/bdc79fdf-d996-42ba-b250-2501738ed0bc-kube-api-access-hncns\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.866785 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hncns\" (UniqueName: \"kubernetes.io/projected/bdc79fdf-d996-42ba-b250-2501738ed0bc-kube-api-access-hncns\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.867194 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc79fdf-d996-42ba-b250-2501738ed0bc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.867851 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc79fdf-d996-42ba-b250-2501738ed0bc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.871555 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc79fdf-d996-42ba-b250-2501738ed0bc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.872134 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc79fdf-d996-42ba-b250-2501738ed0bc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.893996 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hncns\" (UniqueName: \"kubernetes.io/projected/bdc79fdf-d996-42ba-b250-2501738ed0bc-kube-api-access-hncns\") pod \"nova-cell0-conductor-0\" (UID: \"bdc79fdf-d996-42ba-b250-2501738ed0bc\") " pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:04 crc kubenswrapper[4693]: I1125 12:29:04.975529 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:05 crc kubenswrapper[4693]: I1125 12:29:05.401702 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 25 12:29:05 crc kubenswrapper[4693]: I1125 12:29:05.510336 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bdc79fdf-d996-42ba-b250-2501738ed0bc","Type":"ContainerStarted","Data":"5d820bd9178c2a22ea3717252570bc04b2d875201ebddeaaf9e9a5a746039fb3"} Nov 25 12:29:06 crc kubenswrapper[4693]: I1125 12:29:06.525098 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bdc79fdf-d996-42ba-b250-2501738ed0bc","Type":"ContainerStarted","Data":"f2c97f6a2fb3393a1c3f70d3ace032655dc7ee2151cfa2b98193a5289342ecf3"} Nov 25 12:29:06 crc kubenswrapper[4693]: I1125 12:29:06.525416 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:06 crc kubenswrapper[4693]: I1125 12:29:06.555771 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.555750868 podStartE2EDuration="2.555750868s" podCreationTimestamp="2025-11-25 12:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:06.547468133 +0000 UTC m=+1266.465553544" watchObservedRunningTime="2025-11-25 12:29:06.555750868 +0000 UTC m=+1266.473836249" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.006770 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.579835 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-kfp5p"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.581257 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.583502 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.583661 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.588798 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kfp5p"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.667649 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-config-data\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.667732 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.667882 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsnf\" (UniqueName: \"kubernetes.io/projected/505f634a-96dc-4bab-9cf6-416ca6ebf3df-kube-api-access-dgsnf\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.667930 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-scripts\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.732575 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.734812 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.740203 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.761899 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.770627 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsnf\" (UniqueName: \"kubernetes.io/projected/505f634a-96dc-4bab-9cf6-416ca6ebf3df-kube-api-access-dgsnf\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.770686 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-scripts\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.770732 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-config-data\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.770763 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.797169 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.798714 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.806520 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.813528 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.813610 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-config-data\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.813999 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-scripts\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.814561 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.820957 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsnf\" (UniqueName: \"kubernetes.io/projected/505f634a-96dc-4bab-9cf6-416ca6ebf3df-kube-api-access-dgsnf\") pod \"nova-cell0-cell-mapping-kfp5p\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") " pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.872507 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e368b0f-0bcb-4eff-931b-459adb726edc-logs\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.872595 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tssgt\" (UniqueName: \"kubernetes.io/projected/2e368b0f-0bcb-4eff-931b-459adb726edc-kube-api-access-tssgt\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.872647 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-config-data\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.872716 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.913859 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kfp5p" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.944102 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.955147 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.960278 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.970188 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.978518 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e368b0f-0bcb-4eff-931b-459adb726edc-logs\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.978576 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.978618 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.978666 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tssgt\" (UniqueName: \"kubernetes.io/projected/2e368b0f-0bcb-4eff-931b-459adb726edc-kube-api-access-tssgt\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.978698 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqwgd\" (UniqueName: \"kubernetes.io/projected/ec29c9ae-a20a-4d71-abb9-100e510aed1b-kube-api-access-vqwgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.978744 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-config-data\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.978842 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:15 crc kubenswrapper[4693]: I1125 12:29:15.979877 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e368b0f-0bcb-4eff-931b-459adb726edc-logs\") pod 
\"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.011316 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-config-data\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.011990 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.031683 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tssgt\" (UniqueName: \"kubernetes.io/projected/2e368b0f-0bcb-4eff-931b-459adb726edc-kube-api-access-tssgt\") pod \"nova-api-0\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") " pod="openstack/nova-api-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.035163 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.036561 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.042452 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.069824 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.086094 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-logs\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.086152 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhg7l\" (UniqueName: \"kubernetes.io/projected/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-kube-api-access-vhg7l\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.086249 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-config-data\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.086283 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.086318 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.086401 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqwgd\" (UniqueName: \"kubernetes.io/projected/ec29c9ae-a20a-4d71-abb9-100e510aed1b-kube-api-access-vqwgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.086472 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.090910 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.090999 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.110912 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.139964 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqwgd\" (UniqueName: \"kubernetes.io/projected/ec29c9ae-a20a-4d71-abb9-100e510aed1b-kube-api-access-vqwgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.144963 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64dbf5859c-4hs4t"] Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.147246 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.153333 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64dbf5859c-4hs4t"] Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.189665 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-logs\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.189732 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhg7l\" (UniqueName: \"kubernetes.io/projected/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-kube-api-access-vhg7l\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.189797 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.189838 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-config-data\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.189899 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qxd7\" (UniqueName: \"kubernetes.io/projected/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-kube-api-access-5qxd7\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.189937 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-config-data\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.189988 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.208831 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-logs\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.209264 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc 
kubenswrapper[4693]: I1125 12:29:16.209440 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-config-data\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.224523 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.227552 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhg7l\" (UniqueName: \"kubernetes.io/projected/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-kube-api-access-vhg7l\") pod \"nova-metadata-0\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297481 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-svc\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297783 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-swift-storage-0\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297820 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297870 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-config\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297916 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qxd7\" (UniqueName: \"kubernetes.io/projected/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-kube-api-access-5qxd7\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297936 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-nb\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297953 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-sb\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: 
\"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.297969 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-config-data\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.298011 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqx48\" (UniqueName: \"kubernetes.io/projected/c5ad2242-431b-4d4e-a815-3623305d8b38-kube-api-access-qqx48\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.304321 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.314026 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-config-data\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.322534 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qxd7\" (UniqueName: \"kubernetes.io/projected/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-kube-api-access-5qxd7\") pod \"nova-scheduler-0\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.399937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-svc\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.400005 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-swift-storage-0\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.400101 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-config\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.400147 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-nb\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.400169 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-sb\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.400194 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqx48\" (UniqueName: \"kubernetes.io/projected/c5ad2242-431b-4d4e-a815-3623305d8b38-kube-api-access-qqx48\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.401292 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-svc\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.402530 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-config\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.404096 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-sb\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.408124 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-swift-storage-0\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.409112 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-nb\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.422000 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqx48\" (UniqueName: \"kubernetes.io/projected/c5ad2242-431b-4d4e-a815-3623305d8b38-kube-api-access-qqx48\") pod \"dnsmasq-dns-64dbf5859c-4hs4t\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.462316 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.518089 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.540360 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.673490 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kfp5p"] Nov 25 12:29:16 crc kubenswrapper[4693]: W1125 12:29:16.698257 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod505f634a_96dc_4bab_9cf6_416ca6ebf3df.slice/crio-8487278845e84bd55e9e10195113fc601a01316ac587cc5897b9192157676ff0 WatchSource:0}: Error finding container 8487278845e84bd55e9e10195113fc601a01316ac587cc5897b9192157676ff0: Status 404 returned error can't find the container with id 8487278845e84bd55e9e10195113fc601a01316ac587cc5897b9192157676ff0 Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.758625 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-z88lk"] Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.759981 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.762747 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.762945 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.899022 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-z88lk"] Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.900833 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.910007 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4d9z\" (UniqueName: \"kubernetes.io/projected/a868736f-308b-4590-966c-f4d01a5da39a-kube-api-access-d4d9z\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.910414 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-scripts\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.913987 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.914336 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-config-data\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:16 crc kubenswrapper[4693]: I1125 12:29:16.923739 4693 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.016337 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4d9z\" (UniqueName: \"kubernetes.io/projected/a868736f-308b-4590-966c-f4d01a5da39a-kube-api-access-d4d9z\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.016709 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-scripts\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.016770 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.016954 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-config-data\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.021673 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-scripts\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.030710 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-config-data\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.032602 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.035051 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4d9z\" (UniqueName: \"kubernetes.io/projected/a868736f-308b-4590-966c-f4d01a5da39a-kube-api-access-d4d9z\") pod \"nova-cell1-conductor-db-sync-z88lk\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") " pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.187739 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:17 crc kubenswrapper[4693]: W1125 12:29:17.191582 4693 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7fa6d95_c560_46bb_b8f7_c63c13c3b2e9.slice/crio-b6fef8232f276954c95579a4ccd4f2a39f0e3c3711005711a601d3351db40493 WatchSource:0}: Error finding container b6fef8232f276954c95579a4ccd4f2a39f0e3c3711005711a601d3351db40493: Status 404 returned error can't find the container with id b6fef8232f276954c95579a4ccd4f2a39f0e3c3711005711a601d3351db40493 Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.253540 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.258150 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.267663 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64dbf5859c-4hs4t"] Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.653034 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9","Type":"ContainerStarted","Data":"b6fef8232f276954c95579a4ccd4f2a39f0e3c3711005711a601d3351db40493"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.656509 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kfp5p" event={"ID":"505f634a-96dc-4bab-9cf6-416ca6ebf3df","Type":"ContainerStarted","Data":"ab450c1bca8d4252882da7e80641646f1e20abfff53a33a0fd9b53a68b96f151"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.656540 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kfp5p" event={"ID":"505f634a-96dc-4bab-9cf6-416ca6ebf3df","Type":"ContainerStarted","Data":"8487278845e84bd55e9e10195113fc601a01316ac587cc5897b9192157676ff0"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.658591 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfdc7627-8e1b-4ed0-a2f6-56a55629340e","Type":"ContainerStarted","Data":"6ed20ed943ec11fc2986973f0b3680835ff9e41da0fcf3db2fc43cc535cfd5cf"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.662389 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ec29c9ae-a20a-4d71-abb9-100e510aed1b","Type":"ContainerStarted","Data":"664960b2a4b16deb27a3af4b6e37c71a98af73f66fbc9a579c7cfe8a18b261c0"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.666815 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" event={"ID":"c5ad2242-431b-4d4e-a815-3623305d8b38","Type":"ContainerStarted","Data":"a9032ae2a779d2ac3d3906ec30f878d49f24bf7839634ffab00d9bfc0501927c"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.666900 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" event={"ID":"c5ad2242-431b-4d4e-a815-3623305d8b38","Type":"ContainerStarted","Data":"8524ccdf01b8a0729334854aad167cf93d4a598c1750cba3463173fc7e169434"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.683934 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e368b0f-0bcb-4eff-931b-459adb726edc","Type":"ContainerStarted","Data":"530e28b46cfafbfe1f23242236d88dd3b1e5810318c33d003043fa807d0b2f6e"} Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.708570 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-cell-mapping-kfp5p" podStartSLOduration=2.708548249 podStartE2EDuration="2.708548249s" podCreationTimestamp="2025-11-25 12:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:17.674752051 +0000 UTC m=+1277.592837432" watchObservedRunningTime="2025-11-25 12:29:17.708548249 +0000 UTC m=+1277.626633630" Nov 25 12:29:17 crc kubenswrapper[4693]: I1125 12:29:17.901187 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-z88lk"] Nov 25 12:29:17 crc kubenswrapper[4693]: W1125 12:29:17.915909 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda868736f_308b_4590_966c_f4d01a5da39a.slice/crio-740d4e78719b048994ce729f91a59853992ebc9924f283ef81a851f607ddf138 WatchSource:0}: Error finding container 740d4e78719b048994ce729f91a59853992ebc9924f283ef81a851f607ddf138: Status 404 returned error can't find the container with id 740d4e78719b048994ce729f91a59853992ebc9924f283ef81a851f607ddf138 Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.713542 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-z88lk" event={"ID":"a868736f-308b-4590-966c-f4d01a5da39a","Type":"ContainerStarted","Data":"26b3a27127a7b5b1d1071a03e51623f71ae97f28dbd4bb717c5511160467647e"} Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.713867 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-z88lk" event={"ID":"a868736f-308b-4590-966c-f4d01a5da39a","Type":"ContainerStarted","Data":"740d4e78719b048994ce729f91a59853992ebc9924f283ef81a851f607ddf138"} Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.719643 4693 generic.go:334] "Generic (PLEG): container finished" podID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerID="a9032ae2a779d2ac3d3906ec30f878d49f24bf7839634ffab00d9bfc0501927c" exitCode=0 Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.720859 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" event={"ID":"c5ad2242-431b-4d4e-a815-3623305d8b38","Type":"ContainerDied","Data":"a9032ae2a779d2ac3d3906ec30f878d49f24bf7839634ffab00d9bfc0501927c"} Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.720887 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" event={"ID":"c5ad2242-431b-4d4e-a815-3623305d8b38","Type":"ContainerStarted","Data":"6aa0e19f9203e8e6ad67c47c41e5b12c1bea14ca0b0e7d77efbd711698dab350"} Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.720901 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.741239 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-z88lk" podStartSLOduration=2.741216299 podStartE2EDuration="2.741216299s" podCreationTimestamp="2025-11-25 12:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:18.731557945 +0000 UTC m=+1278.649643326" watchObservedRunningTime="2025-11-25 12:29:18.741216299 +0000 UTC m=+1278.659301680" Nov 25 12:29:18 crc kubenswrapper[4693]: I1125 12:29:18.755537 4693 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" podStartSLOduration=2.755519815 podStartE2EDuration="2.755519815s" podCreationTimestamp="2025-11-25 12:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:18.754975719 +0000 UTC m=+1278.673061120" watchObservedRunningTime="2025-11-25 12:29:18.755519815 +0000 UTC m=+1278.673605196" Nov 25 12:29:19 crc kubenswrapper[4693]: I1125 12:29:19.586039 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:19 crc kubenswrapper[4693]: I1125 12:29:19.596068 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:21 crc kubenswrapper[4693]: I1125 12:29:21.752406 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e368b0f-0bcb-4eff-931b-459adb726edc","Type":"ContainerStarted","Data":"ec66add43d659acb958ccbfe8c3d60e28b48989d9a3d0cff14b3f0bd843e39a8"} Nov 25 12:29:21 crc kubenswrapper[4693]: I1125 12:29:21.754084 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9","Type":"ContainerStarted","Data":"48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7"} Nov 25 12:29:21 crc kubenswrapper[4693]: I1125 12:29:21.756943 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfdc7627-8e1b-4ed0-a2f6-56a55629340e","Type":"ContainerStarted","Data":"4a68bd515c72d9e25bfbc68f09254063d96392dbe93a4e35febc03b1ad303a03"} Nov 25 12:29:21 crc kubenswrapper[4693]: I1125 12:29:21.759000 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ec29c9ae-a20a-4d71-abb9-100e510aed1b","Type":"ContainerStarted","Data":"1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042"} Nov 25 12:29:21 crc kubenswrapper[4693]: I1125 12:29:21.759119 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ec29c9ae-a20a-4d71-abb9-100e510aed1b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042" gracePeriod=30 Nov 25 12:29:21 crc kubenswrapper[4693]: I1125 12:29:21.784953 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.789679127 podStartE2EDuration="6.784927761s" podCreationTimestamp="2025-11-25 12:29:15 +0000 UTC" firstStartedPulling="2025-11-25 12:29:17.193140485 +0000 UTC m=+1277.111225876" lastFinishedPulling="2025-11-25 12:29:21.188389129 +0000 UTC m=+1281.106474510" observedRunningTime="2025-11-25 12:29:21.77145227 +0000 UTC m=+1281.689537661" watchObservedRunningTime="2025-11-25 12:29:21.784927761 +0000 UTC m=+1281.703013142" Nov 25 12:29:21 crc kubenswrapper[4693]: I1125 12:29:21.801158 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.469743632 podStartE2EDuration="6.801133991s" podCreationTimestamp="2025-11-25 12:29:15 +0000 UTC" firstStartedPulling="2025-11-25 12:29:16.856492246 +0000 UTC m=+1276.774577627" lastFinishedPulling="2025-11-25 12:29:21.187882605 +0000 UTC m=+1281.105967986" observedRunningTime="2025-11-25 12:29:21.791250311 +0000 UTC m=+1281.709335682" watchObservedRunningTime="2025-11-25 12:29:21.801133991 +0000 
UTC m=+1281.719219372" Nov 25 12:29:22 crc kubenswrapper[4693]: I1125 12:29:22.790649 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e368b0f-0bcb-4eff-931b-459adb726edc","Type":"ContainerStarted","Data":"68292962c3ed0592fe38c82e87d936b6aced0415ec8f9c18368e1917dafc4ee4"} Nov 25 12:29:22 crc kubenswrapper[4693]: I1125 12:29:22.795573 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfdc7627-8e1b-4ed0-a2f6-56a55629340e","Type":"ContainerStarted","Data":"d4aec743d89e427e7a3053c368a5051515b48735a39d3153ddfb2a5358a94d39"} Nov 25 12:29:22 crc kubenswrapper[4693]: I1125 12:29:22.795741 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-log" containerID="cri-o://4a68bd515c72d9e25bfbc68f09254063d96392dbe93a4e35febc03b1ad303a03" gracePeriod=30 Nov 25 12:29:22 crc kubenswrapper[4693]: I1125 12:29:22.795863 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-metadata" containerID="cri-o://d4aec743d89e427e7a3053c368a5051515b48735a39d3153ddfb2a5358a94d39" gracePeriod=30 Nov 25 12:29:22 crc kubenswrapper[4693]: I1125 12:29:22.818416 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.457287483 podStartE2EDuration="7.818395944s" podCreationTimestamp="2025-11-25 12:29:15 +0000 UTC" firstStartedPulling="2025-11-25 12:29:16.834136492 +0000 UTC m=+1276.752221873" lastFinishedPulling="2025-11-25 12:29:21.195244953 +0000 UTC m=+1281.113330334" observedRunningTime="2025-11-25 12:29:22.810292205 +0000 UTC m=+1282.728377596" watchObservedRunningTime="2025-11-25 12:29:22.818395944 +0000 UTC m=+1282.736481345" Nov 25 12:29:22 crc kubenswrapper[4693]: I1125 12:29:22.845163 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.949450308 podStartE2EDuration="7.845141692s" podCreationTimestamp="2025-11-25 12:29:15 +0000 UTC" firstStartedPulling="2025-11-25 12:29:17.29358395 +0000 UTC m=+1277.211669331" lastFinishedPulling="2025-11-25 12:29:21.189275334 +0000 UTC m=+1281.107360715" observedRunningTime="2025-11-25 12:29:22.838958887 +0000 UTC m=+1282.757044258" watchObservedRunningTime="2025-11-25 12:29:22.845141692 +0000 UTC m=+1282.763227073" Nov 25 12:29:23 crc kubenswrapper[4693]: E1125 12:29:23.314518 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfdc7627_8e1b_4ed0_a2f6_56a55629340e.slice/crio-conmon-d4aec743d89e427e7a3053c368a5051515b48735a39d3153ddfb2a5358a94d39.scope\": RecentStats: unable to find data in memory cache]" Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.808529 4693 generic.go:334] "Generic (PLEG): container finished" podID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerID="d4aec743d89e427e7a3053c368a5051515b48735a39d3153ddfb2a5358a94d39" exitCode=0 Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.808815 4693 generic.go:334] "Generic (PLEG): container finished" podID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerID="4a68bd515c72d9e25bfbc68f09254063d96392dbe93a4e35febc03b1ad303a03" exitCode=143 Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.808580 4693 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfdc7627-8e1b-4ed0-a2f6-56a55629340e","Type":"ContainerDied","Data":"d4aec743d89e427e7a3053c368a5051515b48735a39d3153ddfb2a5358a94d39"} Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.808960 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfdc7627-8e1b-4ed0-a2f6-56a55629340e","Type":"ContainerDied","Data":"4a68bd515c72d9e25bfbc68f09254063d96392dbe93a4e35febc03b1ad303a03"} Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.808978 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cfdc7627-8e1b-4ed0-a2f6-56a55629340e","Type":"ContainerDied","Data":"6ed20ed943ec11fc2986973f0b3680835ff9e41da0fcf3db2fc43cc535cfd5cf"} Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.808994 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ed20ed943ec11fc2986973f0b3680835ff9e41da0fcf3db2fc43cc535cfd5cf" Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.889849 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.975339 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-logs\") pod \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.975742 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-logs" (OuterVolumeSpecName: "logs") pod "cfdc7627-8e1b-4ed0-a2f6-56a55629340e" (UID: "cfdc7627-8e1b-4ed0-a2f6-56a55629340e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.975877 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-combined-ca-bundle\") pod \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.976703 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhg7l\" (UniqueName: \"kubernetes.io/projected/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-kube-api-access-vhg7l\") pod \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.976822 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-config-data\") pod \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\" (UID: \"cfdc7627-8e1b-4ed0-a2f6-56a55629340e\") " Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.977421 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:23 crc kubenswrapper[4693]: I1125 12:29:23.991911 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-kube-api-access-vhg7l" (OuterVolumeSpecName: "kube-api-access-vhg7l") pod "cfdc7627-8e1b-4ed0-a2f6-56a55629340e" (UID: "cfdc7627-8e1b-4ed0-a2f6-56a55629340e"). InnerVolumeSpecName "kube-api-access-vhg7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.003616 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfdc7627-8e1b-4ed0-a2f6-56a55629340e" (UID: "cfdc7627-8e1b-4ed0-a2f6-56a55629340e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.006317 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-config-data" (OuterVolumeSpecName: "config-data") pod "cfdc7627-8e1b-4ed0-a2f6-56a55629340e" (UID: "cfdc7627-8e1b-4ed0-a2f6-56a55629340e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.079330 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.079385 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.079398 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhg7l\" (UniqueName: \"kubernetes.io/projected/cfdc7627-8e1b-4ed0-a2f6-56a55629340e-kube-api-access-vhg7l\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.817163 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.858564 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.871405 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.905953 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:24 crc kubenswrapper[4693]: E1125 12:29:24.906832 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-metadata" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.906859 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-metadata" Nov 25 12:29:24 crc kubenswrapper[4693]: E1125 12:29:24.906929 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-log" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.906942 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-log" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.907439 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-metadata" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.907479 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" containerName="nova-metadata-log" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.909516 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.917295 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.917557 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 12:29:24 crc kubenswrapper[4693]: I1125 12:29:24.929250 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.008989 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/babef220-5cd7-4d0a-8281-f4f27f8188ba-logs\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.009124 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.009170 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4xb\" (UniqueName: \"kubernetes.io/projected/babef220-5cd7-4d0a-8281-f4f27f8188ba-kube-api-access-pd4xb\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.009225 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.009305 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-config-data\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.111410 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.111513 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4xb\" (UniqueName: \"kubernetes.io/projected/babef220-5cd7-4d0a-8281-f4f27f8188ba-kube-api-access-pd4xb\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.111585 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.111647 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-config-data\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.111676 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/babef220-5cd7-4d0a-8281-f4f27f8188ba-logs\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.112333 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/babef220-5cd7-4d0a-8281-f4f27f8188ba-logs\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.117542 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.117861 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.120100 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-config-data\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.131540 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4xb\" (UniqueName: \"kubernetes.io/projected/babef220-5cd7-4d0a-8281-f4f27f8188ba-kube-api-access-pd4xb\") pod \"nova-metadata-0\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") " pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.231297 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.688692 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:25 crc kubenswrapper[4693]: W1125 12:29:25.693740 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbabef220_5cd7_4d0a_8281_f4f27f8188ba.slice/crio-98a341b5ead2caae61ff8688326fbaaf91d6efd8eb691992975937cde4b55f46 WatchSource:0}: Error finding container 98a341b5ead2caae61ff8688326fbaaf91d6efd8eb691992975937cde4b55f46: Status 404 returned error can't find the container with id 98a341b5ead2caae61ff8688326fbaaf91d6efd8eb691992975937cde4b55f46 Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.811689 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 12:29:25 crc kubenswrapper[4693]: I1125 12:29:25.832164 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"babef220-5cd7-4d0a-8281-f4f27f8188ba","Type":"ContainerStarted","Data":"98a341b5ead2caae61ff8688326fbaaf91d6efd8eb691992975937cde4b55f46"} Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.071884 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.073529 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.225266 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.519160 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.520469 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.542533 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.550955 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.636279 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7965876c4f-pqjb5"] Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.636631 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" podUID="4e5da329-8105-4bd9-a340-65d273d51bbf" containerName="dnsmasq-dns" containerID="cri-o://2b59293260077cf5251994d285be38340a2349fa8b1d78ef6227a85fd010b5df" gracePeriod=10 Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.827707 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfdc7627-8e1b-4ed0-a2f6-56a55629340e" path="/var/lib/kubelet/pods/cfdc7627-8e1b-4ed0-a2f6-56a55629340e/volumes" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.867967 4693 generic.go:334] "Generic (PLEG): container finished" podID="505f634a-96dc-4bab-9cf6-416ca6ebf3df" containerID="ab450c1bca8d4252882da7e80641646f1e20abfff53a33a0fd9b53a68b96f151" exitCode=0 Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.868349 4693 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-cell-mapping-kfp5p" event={"ID":"505f634a-96dc-4bab-9cf6-416ca6ebf3df","Type":"ContainerDied","Data":"ab450c1bca8d4252882da7e80641646f1e20abfff53a33a0fd9b53a68b96f151"} Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.881025 4693 generic.go:334] "Generic (PLEG): container finished" podID="4e5da329-8105-4bd9-a340-65d273d51bbf" containerID="2b59293260077cf5251994d285be38340a2349fa8b1d78ef6227a85fd010b5df" exitCode=0 Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.883285 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" event={"ID":"4e5da329-8105-4bd9-a340-65d273d51bbf","Type":"ContainerDied","Data":"2b59293260077cf5251994d285be38340a2349fa8b1d78ef6227a85fd010b5df"} Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.894353 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"babef220-5cd7-4d0a-8281-f4f27f8188ba","Type":"ContainerStarted","Data":"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67"} Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.894419 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"babef220-5cd7-4d0a-8281-f4f27f8188ba","Type":"ContainerStarted","Data":"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6"} Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.936040 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.936020336 podStartE2EDuration="2.936020336s" podCreationTimestamp="2025-11-25 12:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:26.918677705 +0000 UTC m=+1286.836763086" watchObservedRunningTime="2025-11-25 12:29:26.936020336 +0000 UTC m=+1286.854105717" Nov 25 12:29:26 crc kubenswrapper[4693]: I1125 12:29:26.936827 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.159559 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.159584 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.280333 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.365533 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-svc\") pod \"4e5da329-8105-4bd9-a340-65d273d51bbf\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.365701 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-swift-storage-0\") pod \"4e5da329-8105-4bd9-a340-65d273d51bbf\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.366057 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-nb\") pod \"4e5da329-8105-4bd9-a340-65d273d51bbf\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.366141 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-sb\") pod \"4e5da329-8105-4bd9-a340-65d273d51bbf\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.366227 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-config\") pod \"4e5da329-8105-4bd9-a340-65d273d51bbf\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.366277 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkvfh\" (UniqueName: \"kubernetes.io/projected/4e5da329-8105-4bd9-a340-65d273d51bbf-kube-api-access-mkvfh\") pod \"4e5da329-8105-4bd9-a340-65d273d51bbf\" (UID: \"4e5da329-8105-4bd9-a340-65d273d51bbf\") " Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.395538 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5da329-8105-4bd9-a340-65d273d51bbf-kube-api-access-mkvfh" (OuterVolumeSpecName: "kube-api-access-mkvfh") pod "4e5da329-8105-4bd9-a340-65d273d51bbf" (UID: "4e5da329-8105-4bd9-a340-65d273d51bbf"). InnerVolumeSpecName "kube-api-access-mkvfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.428294 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e5da329-8105-4bd9-a340-65d273d51bbf" (UID: "4e5da329-8105-4bd9-a340-65d273d51bbf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.433263 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4e5da329-8105-4bd9-a340-65d273d51bbf" (UID: "4e5da329-8105-4bd9-a340-65d273d51bbf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.450978 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e5da329-8105-4bd9-a340-65d273d51bbf" (UID: "4e5da329-8105-4bd9-a340-65d273d51bbf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.470214 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.470257 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkvfh\" (UniqueName: \"kubernetes.io/projected/4e5da329-8105-4bd9-a340-65d273d51bbf-kube-api-access-mkvfh\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.470274 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.470334 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.480648 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-config" (OuterVolumeSpecName: "config") pod "4e5da329-8105-4bd9-a340-65d273d51bbf" (UID: "4e5da329-8105-4bd9-a340-65d273d51bbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.498868 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e5da329-8105-4bd9-a340-65d273d51bbf" (UID: "4e5da329-8105-4bd9-a340-65d273d51bbf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.572203 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.572246 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e5da329-8105-4bd9-a340-65d273d51bbf-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.903423 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5" event={"ID":"4e5da329-8105-4bd9-a340-65d273d51bbf","Type":"ContainerDied","Data":"2f4c1995c4b40c60316dcf1a281667fd7060a2b5d387247fa09721791e0315c4"} Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.903473 4693 scope.go:117] "RemoveContainer" containerID="2b59293260077cf5251994d285be38340a2349fa8b1d78ef6227a85fd010b5df" Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.903583 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7965876c4f-pqjb5"
Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.940886 4693 scope.go:117] "RemoveContainer" containerID="f06b4b7b5a2207d0b95f80c9eb232716289bd7501129a1ad16981ee9d8b4cf74"
Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.955818 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7965876c4f-pqjb5"]
Nov 25 12:29:27 crc kubenswrapper[4693]: I1125 12:29:27.971044 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7965876c4f-pqjb5"]
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.363580 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kfp5p"
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.501111 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-config-data\") pod \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") "
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.501259 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgsnf\" (UniqueName: \"kubernetes.io/projected/505f634a-96dc-4bab-9cf6-416ca6ebf3df-kube-api-access-dgsnf\") pod \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") "
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.501347 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-combined-ca-bundle\") pod \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") "
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.501489 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-scripts\") pod \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\" (UID: \"505f634a-96dc-4bab-9cf6-416ca6ebf3df\") "
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.506124 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-scripts" (OuterVolumeSpecName: "scripts") pod "505f634a-96dc-4bab-9cf6-416ca6ebf3df" (UID: "505f634a-96dc-4bab-9cf6-416ca6ebf3df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.519055 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505f634a-96dc-4bab-9cf6-416ca6ebf3df-kube-api-access-dgsnf" (OuterVolumeSpecName: "kube-api-access-dgsnf") pod "505f634a-96dc-4bab-9cf6-416ca6ebf3df" (UID: "505f634a-96dc-4bab-9cf6-416ca6ebf3df"). InnerVolumeSpecName "kube-api-access-dgsnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.567543 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "505f634a-96dc-4bab-9cf6-416ca6ebf3df" (UID: "505f634a-96dc-4bab-9cf6-416ca6ebf3df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.603888 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.603934 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.603948 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgsnf\" (UniqueName: \"kubernetes.io/projected/505f634a-96dc-4bab-9cf6-416ca6ebf3df-kube-api-access-dgsnf\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.631615 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-config-data" (OuterVolumeSpecName: "config-data") pod "505f634a-96dc-4bab-9cf6-416ca6ebf3df" (UID: "505f634a-96dc-4bab-9cf6-416ca6ebf3df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.705357 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/505f634a-96dc-4bab-9cf6-416ca6ebf3df-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.825573 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5da329-8105-4bd9-a340-65d273d51bbf" path="/var/lib/kubelet/pods/4e5da329-8105-4bd9-a340-65d273d51bbf/volumes"
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.915549 4693 generic.go:334] "Generic (PLEG): container finished" podID="a868736f-308b-4590-966c-f4d01a5da39a" containerID="26b3a27127a7b5b1d1071a03e51623f71ae97f28dbd4bb717c5511160467647e" exitCode=0
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.915647 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-z88lk" event={"ID":"a868736f-308b-4590-966c-f4d01a5da39a","Type":"ContainerDied","Data":"26b3a27127a7b5b1d1071a03e51623f71ae97f28dbd4bb717c5511160467647e"}
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.920334 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kfp5p" event={"ID":"505f634a-96dc-4bab-9cf6-416ca6ebf3df","Type":"ContainerDied","Data":"8487278845e84bd55e9e10195113fc601a01316ac587cc5897b9192157676ff0"}
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.920398 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kfp5p"
Nov 25 12:29:28 crc kubenswrapper[4693]: I1125 12:29:28.920670 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8487278845e84bd55e9e10195113fc601a01316ac587cc5897b9192157676ff0"
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.076821 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.077326 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-api" containerID="cri-o://68292962c3ed0592fe38c82e87d936b6aced0415ec8f9c18368e1917dafc4ee4" gracePeriod=30
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.077770 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-log" containerID="cri-o://ec66add43d659acb958ccbfe8c3d60e28b48989d9a3d0cff14b3f0bd843e39a8" gracePeriod=30
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.099806 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.112816 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.113078 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-log" containerID="cri-o://fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6" gracePeriod=30
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.113152 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-metadata" containerID="cri-o://fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67" gracePeriod=30
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.827184 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.930748 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-nova-metadata-tls-certs\") pod \"babef220-5cd7-4d0a-8281-f4f27f8188ba\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") "
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.930920 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/babef220-5cd7-4d0a-8281-f4f27f8188ba-logs\") pod \"babef220-5cd7-4d0a-8281-f4f27f8188ba\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") "
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.930938 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-combined-ca-bundle\") pod \"babef220-5cd7-4d0a-8281-f4f27f8188ba\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") "
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.930998 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-config-data\") pod \"babef220-5cd7-4d0a-8281-f4f27f8188ba\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") "
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.931017 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd4xb\" (UniqueName: \"kubernetes.io/projected/babef220-5cd7-4d0a-8281-f4f27f8188ba-kube-api-access-pd4xb\") pod \"babef220-5cd7-4d0a-8281-f4f27f8188ba\" (UID: \"babef220-5cd7-4d0a-8281-f4f27f8188ba\") "
Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.932400 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/babef220-5cd7-4d0a-8281-f4f27f8188ba-logs" (OuterVolumeSpecName: "logs") pod "babef220-5cd7-4d0a-8281-f4f27f8188ba" (UID: "babef220-5cd7-4d0a-8281-f4f27f8188ba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.941803 4693 generic.go:334] "Generic (PLEG): container finished" podID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerID="ec66add43d659acb958ccbfe8c3d60e28b48989d9a3d0cff14b3f0bd843e39a8" exitCode=143 Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.941927 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e368b0f-0bcb-4eff-931b-459adb726edc","Type":"ContainerDied","Data":"ec66add43d659acb958ccbfe8c3d60e28b48989d9a3d0cff14b3f0bd843e39a8"} Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.945428 4693 generic.go:334] "Generic (PLEG): container finished" podID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerID="fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67" exitCode=0 Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.945497 4693 generic.go:334] "Generic (PLEG): container finished" podID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerID="fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6" exitCode=143 Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.946044 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" containerName="nova-scheduler-scheduler" containerID="cri-o://48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" gracePeriod=30 Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.946535 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.946855 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/babef220-5cd7-4d0a-8281-f4f27f8188ba-kube-api-access-pd4xb" (OuterVolumeSpecName: "kube-api-access-pd4xb") pod "babef220-5cd7-4d0a-8281-f4f27f8188ba" (UID: "babef220-5cd7-4d0a-8281-f4f27f8188ba"). InnerVolumeSpecName "kube-api-access-pd4xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.947053 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"babef220-5cd7-4d0a-8281-f4f27f8188ba","Type":"ContainerDied","Data":"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67"} Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.947092 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"babef220-5cd7-4d0a-8281-f4f27f8188ba","Type":"ContainerDied","Data":"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6"} Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.947105 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"babef220-5cd7-4d0a-8281-f4f27f8188ba","Type":"ContainerDied","Data":"98a341b5ead2caae61ff8688326fbaaf91d6efd8eb691992975937cde4b55f46"} Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.947144 4693 scope.go:117] "RemoveContainer" containerID="fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67" Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.969840 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "babef220-5cd7-4d0a-8281-f4f27f8188ba" (UID: "babef220-5cd7-4d0a-8281-f4f27f8188ba"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.970225 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-config-data" (OuterVolumeSpecName: "config-data") pod "babef220-5cd7-4d0a-8281-f4f27f8188ba" (UID: "babef220-5cd7-4d0a-8281-f4f27f8188ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:29 crc kubenswrapper[4693]: I1125 12:29:29.986787 4693 scope.go:117] "RemoveContainer" containerID="fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.014209 4693 scope.go:117] "RemoveContainer" containerID="fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67" Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.018246 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67\": container with ID starting with fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67 not found: ID does not exist" containerID="fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.018582 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67"} err="failed to get container status \"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67\": rpc error: code = NotFound desc = could not find container \"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67\": container with ID starting with fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67 not found: ID does not exist" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.018617 4693 scope.go:117] "RemoveContainer" containerID="fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.021205 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "babef220-5cd7-4d0a-8281-f4f27f8188ba" (UID: "babef220-5cd7-4d0a-8281-f4f27f8188ba"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.022348 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6\": container with ID starting with fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6 not found: ID does not exist" containerID="fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.022402 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6"} err="failed to get container status \"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6\": rpc error: code = NotFound desc = could not find container \"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6\": container with ID starting with fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6 not found: ID does not exist" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.022425 4693 scope.go:117] "RemoveContainer" containerID="fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.024263 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67"} err="failed to get container status \"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67\": rpc error: code = NotFound desc = could not find container \"fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67\": container with ID starting with fc0e39f4a1bd185719a22849aaf6775f0fdac63b3c1f84c4934ea6704d335d67 not found: ID does not exist" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.024322 4693 scope.go:117] "RemoveContainer" containerID="fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.024709 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6"} err="failed to get container status \"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6\": rpc error: code = NotFound desc = could not find container \"fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6\": container with ID starting with fd47c84c6dffca3203647cfafaf55cc9edbc6ca10a4b1572ed89fb7b9be1d6f6 not found: ID does not exist" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.033444 4693 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.033495 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/babef220-5cd7-4d0a-8281-f4f27f8188ba-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.033508 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.033518 4693 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/babef220-5cd7-4d0a-8281-f4f27f8188ba-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.033528 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd4xb\" (UniqueName: \"kubernetes.io/projected/babef220-5cd7-4d0a-8281-f4f27f8188ba-kube-api-access-pd4xb\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.263122 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-z88lk" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.282693 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.296958 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.332711 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.333170 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505f634a-96dc-4bab-9cf6-416ca6ebf3df" containerName="nova-manage" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.333187 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="505f634a-96dc-4bab-9cf6-416ca6ebf3df" containerName="nova-manage" Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.333205 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-metadata" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.333213 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-metadata" Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.333231 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5da329-8105-4bd9-a340-65d273d51bbf" containerName="init" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.333240 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5da329-8105-4bd9-a340-65d273d51bbf" containerName="init" Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.333260 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a868736f-308b-4590-966c-f4d01a5da39a" containerName="nova-cell1-conductor-db-sync" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.333269 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a868736f-308b-4590-966c-f4d01a5da39a" containerName="nova-cell1-conductor-db-sync" Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.333286 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-log" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.333294 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-log" Nov 25 12:29:30 crc kubenswrapper[4693]: E1125 12:29:30.333321 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5da329-8105-4bd9-a340-65d273d51bbf" containerName="dnsmasq-dns" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.333330 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5da329-8105-4bd9-a340-65d273d51bbf" containerName="dnsmasq-dns" Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.334069 4693 
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.334091 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a868736f-308b-4590-966c-f4d01a5da39a" containerName="nova-cell1-conductor-db-sync"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.334124 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-metadata"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.334144 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" containerName="nova-metadata-log"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.334157 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="505f634a-96dc-4bab-9cf6-416ca6ebf3df" containerName="nova-manage"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.335396 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.341517 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.344501 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.344577 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-scripts\") pod \"a868736f-308b-4590-966c-f4d01a5da39a\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") "
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.344618 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-combined-ca-bundle\") pod \"a868736f-308b-4590-966c-f4d01a5da39a\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") "
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.344767 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-config-data\") pod \"a868736f-308b-4590-966c-f4d01a5da39a\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") "
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.344831 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4d9z\" (UniqueName: \"kubernetes.io/projected/a868736f-308b-4590-966c-f4d01a5da39a-kube-api-access-d4d9z\") pod \"a868736f-308b-4590-966c-f4d01a5da39a\" (UID: \"a868736f-308b-4590-966c-f4d01a5da39a\") "
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.344898 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.350071 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a868736f-308b-4590-966c-f4d01a5da39a-kube-api-access-d4d9z" (OuterVolumeSpecName: "kube-api-access-d4d9z") pod "a868736f-308b-4590-966c-f4d01a5da39a" (UID: "a868736f-308b-4590-966c-f4d01a5da39a"). InnerVolumeSpecName "kube-api-access-d4d9z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.365761 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-scripts" (OuterVolumeSpecName: "scripts") pod "a868736f-308b-4590-966c-f4d01a5da39a" (UID: "a868736f-308b-4590-966c-f4d01a5da39a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.383585 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-config-data" (OuterVolumeSpecName: "config-data") pod "a868736f-308b-4590-966c-f4d01a5da39a" (UID: "a868736f-308b-4590-966c-f4d01a5da39a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.397136 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a868736f-308b-4590-966c-f4d01a5da39a" (UID: "a868736f-308b-4590-966c-f4d01a5da39a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.454556 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhg87\" (UniqueName: \"kubernetes.io/projected/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-kube-api-access-bhg87\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.454885 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.455037 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-config-data\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.455065 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-logs\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.455099 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.455262 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.455282 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.455291 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4d9z\" (UniqueName: \"kubernetes.io/projected/a868736f-308b-4590-966c-f4d01a5da39a-kube-api-access-d4d9z\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.455302 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a868736f-308b-4590-966c-f4d01a5da39a-scripts\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.556530 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.556618 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-logs\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.556641 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-config-data\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.556667 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.556711 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhg87\" (UniqueName: \"kubernetes.io/projected/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-kube-api-access-bhg87\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.557581 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-logs\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.560802 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.560992 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.561257 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-config-data\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.573281 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhg87\" (UniqueName: \"kubernetes.io/projected/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-kube-api-access-bhg87\") pod \"nova-metadata-0\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.756076 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.756666 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="808a4dcd-a02a-4dd6-a797-5932896d3482" containerName="kube-state-metrics" containerID="cri-o://157be9351e9d0e6bca815b5ffd868d645f56c922a3229c669e3e9819e526beff" gracePeriod=30
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.802543 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.828481 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="babef220-5cd7-4d0a-8281-f4f27f8188ba" path="/var/lib/kubelet/pods/babef220-5cd7-4d0a-8281-f4f27f8188ba/volumes"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.986920 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-z88lk" event={"ID":"a868736f-308b-4590-966c-f4d01a5da39a","Type":"ContainerDied","Data":"740d4e78719b048994ce729f91a59853992ebc9924f283ef81a851f607ddf138"}
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.986961 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="740d4e78719b048994ce729f91a59853992ebc9924f283ef81a851f607ddf138"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.987006 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-z88lk"
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.990555 4693 generic.go:334] "Generic (PLEG): container finished" podID="808a4dcd-a02a-4dd6-a797-5932896d3482" containerID="157be9351e9d0e6bca815b5ffd868d645f56c922a3229c669e3e9819e526beff" exitCode=2
Nov 25 12:29:30 crc kubenswrapper[4693]: I1125 12:29:30.990603 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"808a4dcd-a02a-4dd6-a797-5932896d3482","Type":"ContainerDied","Data":"157be9351e9d0e6bca815b5ffd868d645f56c922a3229c669e3e9819e526beff"}
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.046498 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.048364 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.051270 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.058000 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.174079 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d79906-6ec5-4483-83ef-ae2ff2674c86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.174153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d79906-6ec5-4483-83ef-ae2ff2674c86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.174198 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxvv\" (UniqueName: \"kubernetes.io/projected/35d79906-6ec5-4483-83ef-ae2ff2674c86-kube-api-access-mzxvv\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.275972 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d79906-6ec5-4483-83ef-ae2ff2674c86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.276033 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d79906-6ec5-4483-83ef-ae2ff2674c86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.276064 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxvv\" (UniqueName: \"kubernetes.io/projected/35d79906-6ec5-4483-83ef-ae2ff2674c86-kube-api-access-mzxvv\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.281911 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35d79906-6ec5-4483-83ef-ae2ff2674c86-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.282306 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35d79906-6ec5-4483-83ef-ae2ff2674c86-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.293023 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxvv\" (UniqueName: \"kubernetes.io/projected/35d79906-6ec5-4483-83ef-ae2ff2674c86-kube-api-access-mzxvv\") pod \"nova-cell1-conductor-0\" (UID: \"35d79906-6ec5-4483-83ef-ae2ff2674c86\") " pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.319041 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 25 12:29:31 crc kubenswrapper[4693]: W1125 12:29:31.319455 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe8afed_c8ef_437c_99d0_ecbb1a9e1fa4.slice/crio-84c1600fd2120debb0726f9c819b83221a911beaed30c0858417e0b9ddc1725f WatchSource:0}: Error finding container 84c1600fd2120debb0726f9c819b83221a911beaed30c0858417e0b9ddc1725f: Status 404 returned error can't find the container with id 84c1600fd2120debb0726f9c819b83221a911beaed30c0858417e0b9ddc1725f
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.369006 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: E1125 12:29:31.522385 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 12:29:31 crc kubenswrapper[4693]: E1125 12:29:31.523806 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 12:29:31 crc kubenswrapper[4693]: E1125 12:29:31.525688 4693 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 25 12:29:31 crc kubenswrapper[4693]: E1125 12:29:31.525762 4693 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" containerName="nova-scheduler-scheduler"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.703913 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.793891 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs54q\" (UniqueName: \"kubernetes.io/projected/808a4dcd-a02a-4dd6-a797-5932896d3482-kube-api-access-hs54q\") pod \"808a4dcd-a02a-4dd6-a797-5932896d3482\" (UID: \"808a4dcd-a02a-4dd6-a797-5932896d3482\") "
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.799115 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808a4dcd-a02a-4dd6-a797-5932896d3482-kube-api-access-hs54q" (OuterVolumeSpecName: "kube-api-access-hs54q") pod "808a4dcd-a02a-4dd6-a797-5932896d3482" (UID: "808a4dcd-a02a-4dd6-a797-5932896d3482"). InnerVolumeSpecName "kube-api-access-hs54q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.896563 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs54q\" (UniqueName: \"kubernetes.io/projected/808a4dcd-a02a-4dd6-a797-5932896d3482-kube-api-access-hs54q\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:31 crc kubenswrapper[4693]: W1125 12:29:31.913835 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35d79906_6ec5_4483_83ef_ae2ff2674c86.slice/crio-416449142e349e144d10c09cad4d62fafdddf102474005d7f77c7b8f5347b366 WatchSource:0}: Error finding container 416449142e349e144d10c09cad4d62fafdddf102474005d7f77c7b8f5347b366: Status 404 returned error can't find the container with id 416449142e349e144d10c09cad4d62fafdddf102474005d7f77c7b8f5347b366
Nov 25 12:29:31 crc kubenswrapper[4693]: I1125 12:29:31.916429 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.002813 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"808a4dcd-a02a-4dd6-a797-5932896d3482","Type":"ContainerDied","Data":"7057a85b4c453f3eedae73f4af96f256b95190f827144ae668100f2125a3ee87"}
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.002871 4693 scope.go:117] "RemoveContainer" containerID="157be9351e9d0e6bca815b5ffd868d645f56c922a3229c669e3e9819e526beff"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.002994 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.019748 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4","Type":"ContainerStarted","Data":"84c1600fd2120debb0726f9c819b83221a911beaed30c0858417e0b9ddc1725f"}
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.019824 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"35d79906-6ec5-4483-83ef-ae2ff2674c86","Type":"ContainerStarted","Data":"416449142e349e144d10c09cad4d62fafdddf102474005d7f77c7b8f5347b366"}
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.066383 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.077949 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.095054 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 25 12:29:32 crc kubenswrapper[4693]: E1125 12:29:32.095462 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808a4dcd-a02a-4dd6-a797-5932896d3482" containerName="kube-state-metrics"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.095485 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="808a4dcd-a02a-4dd6-a797-5932896d3482" containerName="kube-state-metrics"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.095648 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="808a4dcd-a02a-4dd6-a797-5932896d3482" containerName="kube-state-metrics"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.096795 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.097843 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.113313 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.113552 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.215676 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.216147 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.216280 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.216450 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwskm\" (UniqueName: \"kubernetes.io/projected/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-api-access-xwskm\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.318579 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.318671 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.318719 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.318793 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwskm\" (UniqueName: \"kubernetes.io/projected/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-api-access-xwskm\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.327205 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.327216 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.333449 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.344388 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwskm\" (UniqueName: \"kubernetes.io/projected/ee5b4281-3cdb-4bad-8002-8520136232a4-kube-api-access-xwskm\") pod \"kube-state-metrics-0\" (UID: \"ee5b4281-3cdb-4bad-8002-8520136232a4\") " pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.440611 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.688568 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.689667 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="sg-core" containerID="cri-o://ffa259d9c0f30021eddc950d93a0f9865116860b783849e88c75a3be29c8abf6" gracePeriod=30
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.689622 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-central-agent" containerID="cri-o://87c41bb256d68f6bf3bf4d5a50aed7fda933bef5e360aa6a807460997d464589" gracePeriod=30
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.689793 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-notification-agent" containerID="cri-o://e9ca68dbc65fb897e09f9f5d8f854603d26c4aa7d019cf824392742f77a0b365" gracePeriod=30
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.689816 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="proxy-httpd" containerID="cri-o://fa721a39c46572d7fa37e6fac24e52af88798f03bb33bcdcdb09c2ceaf5690a2" gracePeriod=30
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.846865 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808a4dcd-a02a-4dd6-a797-5932896d3482" path="/var/lib/kubelet/pods/808a4dcd-a02a-4dd6-a797-5932896d3482/volumes"
Nov 25 12:29:32 crc kubenswrapper[4693]: I1125 12:29:32.925864 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.022010 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee5b4281-3cdb-4bad-8002-8520136232a4","Type":"ContainerStarted","Data":"89dbad29d2ad34a2788127317fb6949f0b03cea55c9a614a8f6e01ca42458828"}
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.025787 4693 generic.go:334] "Generic (PLEG): container finished" podID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerID="fa721a39c46572d7fa37e6fac24e52af88798f03bb33bcdcdb09c2ceaf5690a2" exitCode=0
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.025825 4693 generic.go:334] "Generic (PLEG): container finished" podID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerID="ffa259d9c0f30021eddc950d93a0f9865116860b783849e88c75a3be29c8abf6" exitCode=2
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.025867 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerDied","Data":"fa721a39c46572d7fa37e6fac24e52af88798f03bb33bcdcdb09c2ceaf5690a2"}
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.025891 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerDied","Data":"ffa259d9c0f30021eddc950d93a0f9865116860b783849e88c75a3be29c8abf6"}
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.030592 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4","Type":"ContainerStarted","Data":"10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e"}
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.033354 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"35d79906-6ec5-4483-83ef-ae2ff2674c86","Type":"ContainerStarted","Data":"0f9a3532e27d7ab0f759155e34e8f290ea1ff535b5e20328bcb724ebf3dae9dd"}
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.034406 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Nov 25 12:29:33 crc kubenswrapper[4693]: I1125 12:29:33.053794 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.053772351 podStartE2EDuration="2.053772351s" podCreationTimestamp="2025-11-25 12:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:33.048145062 +0000 UTC m=+1292.966230443" watchObservedRunningTime="2025-11-25 12:29:33.053772351 +0000 UTC m=+1292.971857742"
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.049906 4693 generic.go:334] "Generic (PLEG): container finished" podID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerID="87c41bb256d68f6bf3bf4d5a50aed7fda933bef5e360aa6a807460997d464589" exitCode=0
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.050175 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerDied","Data":"87c41bb256d68f6bf3bf4d5a50aed7fda933bef5e360aa6a807460997d464589"}
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.054695 4693 generic.go:334] "Generic (PLEG): container finished" podID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerID="68292962c3ed0592fe38c82e87d936b6aced0415ec8f9c18368e1917dafc4ee4" exitCode=0
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.054774 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e368b0f-0bcb-4eff-931b-459adb726edc","Type":"ContainerDied","Data":"68292962c3ed0592fe38c82e87d936b6aced0415ec8f9c18368e1917dafc4ee4"}
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.057696 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4","Type":"ContainerStarted","Data":"c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0"}
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.093539 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.093514131 podStartE2EDuration="4.093514131s" podCreationTimestamp="2025-11-25 12:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:34.073853884 +0000 UTC m=+1293.991939265" watchObservedRunningTime="2025-11-25 12:29:34.093514131 +0000 UTC m=+1294.011599512"
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.393262 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.458233 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-config-data\") pod \"2e368b0f-0bcb-4eff-931b-459adb726edc\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") "
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.458407 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-combined-ca-bundle\") pod \"2e368b0f-0bcb-4eff-931b-459adb726edc\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") "
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.458692 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e368b0f-0bcb-4eff-931b-459adb726edc-logs\") pod \"2e368b0f-0bcb-4eff-931b-459adb726edc\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") "
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.458798 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tssgt\" (UniqueName: \"kubernetes.io/projected/2e368b0f-0bcb-4eff-931b-459adb726edc-kube-api-access-tssgt\") pod \"2e368b0f-0bcb-4eff-931b-459adb726edc\" (UID: \"2e368b0f-0bcb-4eff-931b-459adb726edc\") "
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.459482 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e368b0f-0bcb-4eff-931b-459adb726edc-logs" (OuterVolumeSpecName: "logs") pod "2e368b0f-0bcb-4eff-931b-459adb726edc" (UID: "2e368b0f-0bcb-4eff-931b-459adb726edc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.461085 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e368b0f-0bcb-4eff-931b-459adb726edc-logs\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.468999 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e368b0f-0bcb-4eff-931b-459adb726edc-kube-api-access-tssgt" (OuterVolumeSpecName: "kube-api-access-tssgt") pod "2e368b0f-0bcb-4eff-931b-459adb726edc" (UID: "2e368b0f-0bcb-4eff-931b-459adb726edc"). InnerVolumeSpecName "kube-api-access-tssgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.488744 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e368b0f-0bcb-4eff-931b-459adb726edc" (UID: "2e368b0f-0bcb-4eff-931b-459adb726edc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.497261 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-config-data" (OuterVolumeSpecName: "config-data") pod "2e368b0f-0bcb-4eff-931b-459adb726edc" (UID: "2e368b0f-0bcb-4eff-931b-459adb726edc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.562326 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.562359 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e368b0f-0bcb-4eff-931b-459adb726edc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.562393 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tssgt\" (UniqueName: \"kubernetes.io/projected/2e368b0f-0bcb-4eff-931b-459adb726edc-kube-api-access-tssgt\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.881248 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.974390 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-combined-ca-bundle\") pod \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") "
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.974504 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qxd7\" (UniqueName: \"kubernetes.io/projected/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-kube-api-access-5qxd7\") pod \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") "
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.974615 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-config-data\") pod \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\" (UID: \"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9\") "
Nov 25 12:29:34 crc kubenswrapper[4693]: I1125 12:29:34.985945 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-kube-api-access-5qxd7" (OuterVolumeSpecName: "kube-api-access-5qxd7") pod "c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" (UID: "c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9"). InnerVolumeSpecName "kube-api-access-5qxd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.016067 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-config-data" (OuterVolumeSpecName: "config-data") pod "c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" (UID: "c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.019581 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" (UID: "c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.075689 4693 generic.go:334] "Generic (PLEG): container finished" podID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerID="e9ca68dbc65fb897e09f9f5d8f854603d26c4aa7d019cf824392742f77a0b365" exitCode=0
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.075789 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerDied","Data":"e9ca68dbc65fb897e09f9f5d8f854603d26c4aa7d019cf824392742f77a0b365"}
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.076858 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qxd7\" (UniqueName: \"kubernetes.io/projected/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-kube-api-access-5qxd7\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.076881 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-config-data\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.076892 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.081393 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2e368b0f-0bcb-4eff-931b-459adb726edc","Type":"ContainerDied","Data":"530e28b46cfafbfe1f23242236d88dd3b1e5810318c33d003043fa807d0b2f6e"}
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.081404 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.081441 4693 scope.go:117] "RemoveContainer" containerID="68292962c3ed0592fe38c82e87d936b6aced0415ec8f9c18368e1917dafc4ee4"
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.086780 4693 generic.go:334] "Generic (PLEG): container finished" podID="c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" containerID="48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" exitCode=0
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.086881 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9","Type":"ContainerDied","Data":"48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7"}
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.086917 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9","Type":"ContainerDied","Data":"b6fef8232f276954c95579a4ccd4f2a39f0e3c3711005711a601d3351db40493"}
Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.086982 4693 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.090847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee5b4281-3cdb-4bad-8002-8520136232a4","Type":"ContainerStarted","Data":"d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425"} Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.104821 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.114348 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.124846 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.125280 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" containerName="nova-scheduler-scheduler" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.125298 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" containerName="nova-scheduler-scheduler" Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.125322 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-log" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.125331 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-log" Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.125359 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-api" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.125366 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-api" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.125579 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" containerName="nova-scheduler-scheduler" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.125605 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-api" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.125617 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" containerName="nova-api-log" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.126669 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.128435 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.137763 4693 scope.go:117] "RemoveContainer" containerID="ec66add43d659acb958ccbfe8c3d60e28b48989d9a3d0cff14b3f0bd843e39a8" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.138059 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.190579 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.191499 4693 scope.go:117] "RemoveContainer" containerID="48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.200187 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.203061 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.215493 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.216513 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="sg-core" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.216536 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="sg-core" Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.216558 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="proxy-httpd" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.216566 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="proxy-httpd" Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.216587 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-central-agent" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.216596 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-central-agent" Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.216619 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-notification-agent" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.216625 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-notification-agent" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.216965 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-central-agent" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.216997 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="ceilometer-notification-agent" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.217012 4693 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="proxy-httpd" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.217029 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" containerName="sg-core" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.218335 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.224961 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.228651 4693 scope.go:117] "RemoveContainer" containerID="48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" Nov 25 12:29:35 crc kubenswrapper[4693]: E1125 12:29:35.234637 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7\": container with ID starting with 48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7 not found: ID does not exist" containerID="48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.234688 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7"} err="failed to get container status \"48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7\": rpc error: code = NotFound desc = could not find container \"48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7\": container with ID starting with 48e22c0230bc83a72158a3f0f47b1fa9f98898965a9e59d68e97be34df8180b7 not found: ID does not exist" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.269583 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.295365 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-run-httpd\") pod \"100b892d-b171-4ed1-a355-fc4e59d989a0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.295739 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-scripts\") pod \"100b892d-b171-4ed1-a355-fc4e59d989a0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.295736 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "100b892d-b171-4ed1-a355-fc4e59d989a0" (UID: "100b892d-b171-4ed1-a355-fc4e59d989a0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.295776 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-config-data\") pod \"100b892d-b171-4ed1-a355-fc4e59d989a0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.295833 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-sg-core-conf-yaml\") pod \"100b892d-b171-4ed1-a355-fc4e59d989a0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.295878 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5qs2\" (UniqueName: \"kubernetes.io/projected/100b892d-b171-4ed1-a355-fc4e59d989a0-kube-api-access-f5qs2\") pod \"100b892d-b171-4ed1-a355-fc4e59d989a0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296003 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-combined-ca-bundle\") pod \"100b892d-b171-4ed1-a355-fc4e59d989a0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296031 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-log-httpd\") pod \"100b892d-b171-4ed1-a355-fc4e59d989a0\" (UID: \"100b892d-b171-4ed1-a355-fc4e59d989a0\") " Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296245 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx44q\" (UniqueName: \"kubernetes.io/projected/721e10a7-c21e-449c-8186-aa83c1d7f97d-kube-api-access-bx44q\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296286 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-config-data\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296308 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296355 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpsbf\" (UniqueName: \"kubernetes.io/projected/fb0f1b87-9948-4fc0-b273-0aba59284c59-kube-api-access-vpsbf\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296425 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-config-data\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296444 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296485 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0f1b87-9948-4fc0-b273-0aba59284c59-logs\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296934 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "100b892d-b171-4ed1-a355-fc4e59d989a0" (UID: "100b892d-b171-4ed1-a355-fc4e59d989a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.296955 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.301177 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-scripts" (OuterVolumeSpecName: "scripts") pod "100b892d-b171-4ed1-a355-fc4e59d989a0" (UID: "100b892d-b171-4ed1-a355-fc4e59d989a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.309980 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100b892d-b171-4ed1-a355-fc4e59d989a0-kube-api-access-f5qs2" (OuterVolumeSpecName: "kube-api-access-f5qs2") pod "100b892d-b171-4ed1-a355-fc4e59d989a0" (UID: "100b892d-b171-4ed1-a355-fc4e59d989a0"). InnerVolumeSpecName "kube-api-access-f5qs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.354641 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "100b892d-b171-4ed1-a355-fc4e59d989a0" (UID: "100b892d-b171-4ed1-a355-fc4e59d989a0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399018 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpsbf\" (UniqueName: \"kubernetes.io/projected/fb0f1b87-9948-4fc0-b273-0aba59284c59-kube-api-access-vpsbf\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399125 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-config-data\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399153 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399192 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0f1b87-9948-4fc0-b273-0aba59284c59-logs\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399245 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx44q\" (UniqueName: \"kubernetes.io/projected/721e10a7-c21e-449c-8186-aa83c1d7f97d-kube-api-access-bx44q\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399279 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-config-data\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399304 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399389 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399403 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5qs2\" (UniqueName: \"kubernetes.io/projected/100b892d-b171-4ed1-a355-fc4e59d989a0-kube-api-access-f5qs2\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399416 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/100b892d-b171-4ed1-a355-fc4e59d989a0-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.399425 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.400257 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0f1b87-9948-4fc0-b273-0aba59284c59-logs\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.402192 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "100b892d-b171-4ed1-a355-fc4e59d989a0" (UID: "100b892d-b171-4ed1-a355-fc4e59d989a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.405883 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.406183 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.415415 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-config-data\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.417838 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-config-data\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.419317 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx44q\" (UniqueName: \"kubernetes.io/projected/721e10a7-c21e-449c-8186-aa83c1d7f97d-kube-api-access-bx44q\") pod \"nova-scheduler-0\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.419806 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpsbf\" (UniqueName: \"kubernetes.io/projected/fb0f1b87-9948-4fc0-b273-0aba59284c59-kube-api-access-vpsbf\") pod \"nova-api-0\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.429363 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-config-data" (OuterVolumeSpecName: "config-data") pod "100b892d-b171-4ed1-a355-fc4e59d989a0" (UID: "100b892d-b171-4ed1-a355-fc4e59d989a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.447855 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.501824 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.501867 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/100b892d-b171-4ed1-a355-fc4e59d989a0-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.546289 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.802721 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.803045 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 12:29:35 crc kubenswrapper[4693]: I1125 12:29:35.960631 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.082735 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:29:36 crc kubenswrapper[4693]: W1125 12:29:36.088349 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod721e10a7_c21e_449c_8186_aa83c1d7f97d.slice/crio-ed782d2a70dacf1f9424b80b322394f74869696586c927c95dc2654d4be801af WatchSource:0}: Error finding container ed782d2a70dacf1f9424b80b322394f74869696586c927c95dc2654d4be801af: Status 404 returned error can't find the container with id ed782d2a70dacf1f9424b80b322394f74869696586c927c95dc2654d4be801af Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.102025 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"721e10a7-c21e-449c-8186-aa83c1d7f97d","Type":"ContainerStarted","Data":"ed782d2a70dacf1f9424b80b322394f74869696586c927c95dc2654d4be801af"} Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.106644 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.106634 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"100b892d-b171-4ed1-a355-fc4e59d989a0","Type":"ContainerDied","Data":"4b5a0e23ba86e17d7cb145dec0b016c117af3700094f548ffa241622dc6cd929"} Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.106732 4693 scope.go:117] "RemoveContainer" containerID="fa721a39c46572d7fa37e6fac24e52af88798f03bb33bcdcdb09c2ceaf5690a2" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.111193 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb0f1b87-9948-4fc0-b273-0aba59284c59","Type":"ContainerStarted","Data":"779a03248c1ff094e8381b79d9355a13cda47107d828408f69deb15af0294624"} Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.115278 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.134300 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.315924867 podStartE2EDuration="4.13427979s" podCreationTimestamp="2025-11-25 12:29:32 +0000 UTC" firstStartedPulling="2025-11-25 12:29:32.932532885 +0000 UTC m=+1292.850618266" lastFinishedPulling="2025-11-25 12:29:34.750887798 +0000 UTC m=+1294.668973189" observedRunningTime="2025-11-25 12:29:36.130448093 +0000 UTC m=+1296.048533474" watchObservedRunningTime="2025-11-25 12:29:36.13427979 +0000 UTC m=+1296.052365171" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.154318 4693 scope.go:117] "RemoveContainer" containerID="ffa259d9c0f30021eddc950d93a0f9865116860b783849e88c75a3be29c8abf6" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.154982 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.168572 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.190122 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.192781 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.200431 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.201812 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.204173 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.206710 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.228205 4693 scope.go:117] "RemoveContainer" containerID="e9ca68dbc65fb897e09f9f5d8f854603d26c4aa7d019cf824392742f77a0b365" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.252155 4693 scope.go:117] "RemoveContainer" containerID="87c41bb256d68f6bf3bf4d5a50aed7fda933bef5e360aa6a807460997d464589" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.317760 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-scripts\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.317831 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvd5t\" (UniqueName: \"kubernetes.io/projected/5ebe2ec7-b555-44a1-9865-51ac733d033d-kube-api-access-wvd5t\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.317855 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.317876 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-config-data\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.317906 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-run-httpd\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.317953 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-log-httpd\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.318174 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.318289 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420476 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvd5t\" (UniqueName: \"kubernetes.io/projected/5ebe2ec7-b555-44a1-9865-51ac733d033d-kube-api-access-wvd5t\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420519 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420544 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-config-data\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420572 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-run-httpd\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420598 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-log-httpd\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420661 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420689 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.420715 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-scripts\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.421079 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-log-httpd\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.421354 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-run-httpd\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.424905 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.424946 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-scripts\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.426653 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.427996 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-config-data\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.428429 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.437278 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvd5t\" (UniqueName: \"kubernetes.io/projected/5ebe2ec7-b555-44a1-9865-51ac733d033d-kube-api-access-wvd5t\") pod \"ceilometer-0\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.529722 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.825245 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100b892d-b171-4ed1-a355-fc4e59d989a0" path="/var/lib/kubelet/pods/100b892d-b171-4ed1-a355-fc4e59d989a0/volumes" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.826451 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e368b0f-0bcb-4eff-931b-459adb726edc" path="/var/lib/kubelet/pods/2e368b0f-0bcb-4eff-931b-459adb726edc/volumes" Nov 25 12:29:36 crc kubenswrapper[4693]: I1125 12:29:36.827099 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9" path="/var/lib/kubelet/pods/c7fa6d95-c560-46bb-b8f7-c63c13c3b2e9/volumes" Nov 25 12:29:37 crc kubenswrapper[4693]: I1125 12:29:37.002454 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:29:37 crc kubenswrapper[4693]: W1125 12:29:37.004234 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ebe2ec7_b555_44a1_9865_51ac733d033d.slice/crio-64e9ed35781dd1ac2e38fe728c40a2e5698fa6d95c274824f8943dcc3fbdea6d WatchSource:0}: Error finding container 64e9ed35781dd1ac2e38fe728c40a2e5698fa6d95c274824f8943dcc3fbdea6d: Status 404 returned error can't find the container with id 64e9ed35781dd1ac2e38fe728c40a2e5698fa6d95c274824f8943dcc3fbdea6d Nov 25 12:29:37 crc kubenswrapper[4693]: I1125 12:29:37.125199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb0f1b87-9948-4fc0-b273-0aba59284c59","Type":"ContainerStarted","Data":"1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142"} Nov 25 12:29:37 crc kubenswrapper[4693]: I1125 12:29:37.125248 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb0f1b87-9948-4fc0-b273-0aba59284c59","Type":"ContainerStarted","Data":"f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e"} Nov 25 12:29:37 crc kubenswrapper[4693]: I1125 12:29:37.127392 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"721e10a7-c21e-449c-8186-aa83c1d7f97d","Type":"ContainerStarted","Data":"a75608c79f42b61590a27290652e89b196514588a4e403c6cb702feab8766c79"} Nov 25 12:29:37 crc kubenswrapper[4693]: I1125 12:29:37.129252 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerStarted","Data":"64e9ed35781dd1ac2e38fe728c40a2e5698fa6d95c274824f8943dcc3fbdea6d"} Nov 25 12:29:37 crc kubenswrapper[4693]: I1125 12:29:37.149591 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.149573451 podStartE2EDuration="2.149573451s" podCreationTimestamp="2025-11-25 12:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:37.140726424 +0000 UTC m=+1297.058811805" watchObservedRunningTime="2025-11-25 12:29:37.149573451 +0000 UTC m=+1297.067658832" Nov 25 12:29:37 crc kubenswrapper[4693]: I1125 12:29:37.170166 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.170150474 podStartE2EDuration="2.170150474s" podCreationTimestamp="2025-11-25 12:29:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:37.159613111 +0000 UTC m=+1297.077698512" watchObservedRunningTime="2025-11-25 12:29:37.170150474 +0000 UTC m=+1297.088235855" Nov 25 12:29:39 crc kubenswrapper[4693]: I1125 12:29:39.160287 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerStarted","Data":"756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62"} Nov 25 12:29:40 crc kubenswrapper[4693]: I1125 12:29:40.172524 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerStarted","Data":"6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32"} Nov 25 12:29:40 crc kubenswrapper[4693]: I1125 12:29:40.547397 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 12:29:40 crc kubenswrapper[4693]: I1125 12:29:40.802705 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 12:29:40 crc kubenswrapper[4693]: I1125 12:29:40.802840 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 12:29:41 crc kubenswrapper[4693]: I1125 12:29:41.182955 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerStarted","Data":"b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd"} Nov 25 12:29:41 crc kubenswrapper[4693]: I1125 12:29:41.406528 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 25 12:29:41 crc kubenswrapper[4693]: I1125 12:29:41.817562 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 12:29:41 crc kubenswrapper[4693]: I1125 12:29:41.817562 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 12:29:42 crc kubenswrapper[4693]: I1125 12:29:42.488508 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 12:29:43 crc kubenswrapper[4693]: I1125 12:29:43.207170 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerStarted","Data":"1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083"} Nov 25 12:29:43 crc kubenswrapper[4693]: I1125 12:29:43.207390 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 12:29:43 crc kubenswrapper[4693]: I1125 12:29:43.232949 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4962723470000001 podStartE2EDuration="7.232899603s" podCreationTimestamp="2025-11-25 12:29:36 +0000 UTC" 
firstStartedPulling="2025-11-25 12:29:37.006401781 +0000 UTC m=+1296.924487162" lastFinishedPulling="2025-11-25 12:29:42.743029037 +0000 UTC m=+1302.661114418" observedRunningTime="2025-11-25 12:29:43.231114962 +0000 UTC m=+1303.149200373" watchObservedRunningTime="2025-11-25 12:29:43.232899603 +0000 UTC m=+1303.150984994" Nov 25 12:29:45 crc kubenswrapper[4693]: I1125 12:29:45.449293 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 12:29:45 crc kubenswrapper[4693]: I1125 12:29:45.449652 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 12:29:45 crc kubenswrapper[4693]: I1125 12:29:45.547662 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 12:29:45 crc kubenswrapper[4693]: I1125 12:29:45.577641 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 12:29:46 crc kubenswrapper[4693]: I1125 12:29:46.286116 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 12:29:46 crc kubenswrapper[4693]: I1125 12:29:46.530628 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 12:29:46 crc kubenswrapper[4693]: I1125 12:29:46.530628 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 25 12:29:50 crc kubenswrapper[4693]: I1125 12:29:50.808224 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 12:29:50 crc kubenswrapper[4693]: I1125 12:29:50.808687 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 12:29:50 crc kubenswrapper[4693]: I1125 12:29:50.839598 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 12:29:50 crc kubenswrapper[4693]: I1125 12:29:50.839662 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.215786 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.316548 4693 generic.go:334] "Generic (PLEG): container finished" podID="ec29c9ae-a20a-4d71-abb9-100e510aed1b" containerID="1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042" exitCode=137 Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.316608 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ec29c9ae-a20a-4d71-abb9-100e510aed1b","Type":"ContainerDied","Data":"1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042"} Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.316626 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.316663 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ec29c9ae-a20a-4d71-abb9-100e510aed1b","Type":"ContainerDied","Data":"664960b2a4b16deb27a3af4b6e37c71a98af73f66fbc9a579c7cfe8a18b261c0"} Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.316687 4693 scope.go:117] "RemoveContainer" containerID="1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.351848 4693 scope.go:117] "RemoveContainer" containerID="1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042" Nov 25 12:29:52 crc kubenswrapper[4693]: E1125 12:29:52.352482 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042\": container with ID starting with 1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042 not found: ID does not exist" containerID="1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.352537 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042"} err="failed to get container status \"1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042\": rpc error: code = NotFound desc = could not find container \"1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042\": container with ID starting with 1ff4c1d1c419f2a389617072cedf12352ea83d3a256f8b209147ccefbadf8042 not found: ID does not exist" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.361428 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqwgd\" (UniqueName: \"kubernetes.io/projected/ec29c9ae-a20a-4d71-abb9-100e510aed1b-kube-api-access-vqwgd\") pod \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.361519 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-config-data\") pod \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.361971 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-combined-ca-bundle\") pod \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\" (UID: \"ec29c9ae-a20a-4d71-abb9-100e510aed1b\") " Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.368500 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec29c9ae-a20a-4d71-abb9-100e510aed1b-kube-api-access-vqwgd" (OuterVolumeSpecName: "kube-api-access-vqwgd") pod "ec29c9ae-a20a-4d71-abb9-100e510aed1b" (UID: "ec29c9ae-a20a-4d71-abb9-100e510aed1b"). InnerVolumeSpecName "kube-api-access-vqwgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.391566 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec29c9ae-a20a-4d71-abb9-100e510aed1b" (UID: "ec29c9ae-a20a-4d71-abb9-100e510aed1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.395278 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-config-data" (OuterVolumeSpecName: "config-data") pod "ec29c9ae-a20a-4d71-abb9-100e510aed1b" (UID: "ec29c9ae-a20a-4d71-abb9-100e510aed1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.464744 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.465070 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqwgd\" (UniqueName: \"kubernetes.io/projected/ec29c9ae-a20a-4d71-abb9-100e510aed1b-kube-api-access-vqwgd\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.465276 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec29c9ae-a20a-4d71-abb9-100e510aed1b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.650229 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.657484 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.671911 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:52 crc kubenswrapper[4693]: E1125 12:29:52.672321 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec29c9ae-a20a-4d71-abb9-100e510aed1b" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.672340 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec29c9ae-a20a-4d71-abb9-100e510aed1b" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.672573 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec29c9ae-a20a-4d71-abb9-100e510aed1b" containerName="nova-cell1-novncproxy-novncproxy" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.673231 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.675411 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.675978 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.676365 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.687899 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.770585 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zv5j\" (UniqueName: \"kubernetes.io/projected/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-kube-api-access-2zv5j\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.770879 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.771044 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.771137 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.771270 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.844513 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec29c9ae-a20a-4d71-abb9-100e510aed1b" path="/var/lib/kubelet/pods/ec29c9ae-a20a-4d71-abb9-100e510aed1b/volumes" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.872762 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.872854 4693 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.872874 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.872922 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.873558 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zv5j\" (UniqueName: \"kubernetes.io/projected/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-kube-api-access-2zv5j\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.877568 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.878876 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.882349 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.882792 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.889454 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zv5j\" (UniqueName: \"kubernetes.io/projected/f2e39ed4-ac1c-4961-80a1-24b93bed8f4b-kube-api-access-2zv5j\") pod \"nova-cell1-novncproxy-0\" (UID: \"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b\") " pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:52 crc kubenswrapper[4693]: I1125 12:29:52.988823 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:53 crc kubenswrapper[4693]: I1125 12:29:53.425854 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 25 12:29:54 crc kubenswrapper[4693]: I1125 12:29:54.341118 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b","Type":"ContainerStarted","Data":"b57373e4805dbbe267c8182d5b3588f863c3c025a5159a9eaab697b6853d3692"} Nov 25 12:29:54 crc kubenswrapper[4693]: I1125 12:29:54.341454 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f2e39ed4-ac1c-4961-80a1-24b93bed8f4b","Type":"ContainerStarted","Data":"81ff3d417a02e53ec55f9be09f65b89bd22cbafa55b1de71a5d0f6195a68794b"} Nov 25 12:29:54 crc kubenswrapper[4693]: I1125 12:29:54.358406 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.358365623 podStartE2EDuration="2.358365623s" podCreationTimestamp="2025-11-25 12:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:54.358105235 +0000 UTC m=+1314.276190616" watchObservedRunningTime="2025-11-25 12:29:54.358365623 +0000 UTC m=+1314.276451024" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.453004 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.453312 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.453690 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.453706 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.456393 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.458435 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.648529 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55bfb77665-bmc5s"] Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.650180 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.664656 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55bfb77665-bmc5s"] Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.725143 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-svc\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.725190 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.725242 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.725315 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-swift-storage-0\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.725353 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-config\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.725397 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gdvw\" (UniqueName: \"kubernetes.io/projected/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-kube-api-access-9gdvw\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.826776 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-config\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.826836 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gdvw\" (UniqueName: \"kubernetes.io/projected/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-kube-api-access-9gdvw\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.826876 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-svc\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.826903 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.826946 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.827016 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-swift-storage-0\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.827931 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-config\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.828142 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-swift-storage-0\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.828296 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.828653 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.828788 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-svc\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.856278 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gdvw\" (UniqueName: 
\"kubernetes.io/projected/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-kube-api-access-9gdvw\") pod \"dnsmasq-dns-55bfb77665-bmc5s\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:55 crc kubenswrapper[4693]: I1125 12:29:55.978772 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:56 crc kubenswrapper[4693]: I1125 12:29:56.494857 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55bfb77665-bmc5s"] Nov 25 12:29:56 crc kubenswrapper[4693]: W1125 12:29:56.498358 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod757ab9ac_deb9_4efe_b2fc_5331d4314c0b.slice/crio-28a66205a79cb1684e0e0284d1009087f6bbbaf86dd4d7a38503f90e0dd7ad1c WatchSource:0}: Error finding container 28a66205a79cb1684e0e0284d1009087f6bbbaf86dd4d7a38503f90e0dd7ad1c: Status 404 returned error can't find the container with id 28a66205a79cb1684e0e0284d1009087f6bbbaf86dd4d7a38503f90e0dd7ad1c Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.369333 4693 generic.go:334] "Generic (PLEG): container finished" podID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerID="69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8" exitCode=0 Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.370844 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" event={"ID":"757ab9ac-deb9-4efe-b2fc-5331d4314c0b","Type":"ContainerDied","Data":"69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8"} Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.370883 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" event={"ID":"757ab9ac-deb9-4efe-b2fc-5331d4314c0b","Type":"ContainerStarted","Data":"28a66205a79cb1684e0e0284d1009087f6bbbaf86dd4d7a38503f90e0dd7ad1c"} Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.638374 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.638954 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-central-agent" containerID="cri-o://756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62" gracePeriod=30 Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.639077 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="proxy-httpd" containerID="cri-o://1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083" gracePeriod=30 Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.639115 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="sg-core" containerID="cri-o://b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd" gracePeriod=30 Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.639147 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-notification-agent" containerID="cri-o://6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32" gracePeriod=30 Nov 25 12:29:57 
crc kubenswrapper[4693]: I1125 12:29:57.663672 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 12:29:57 crc kubenswrapper[4693]: I1125 12:29:57.989453 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.380772 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" event={"ID":"757ab9ac-deb9-4efe-b2fc-5331d4314c0b","Type":"ContainerStarted","Data":"6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f"} Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.381105 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.383615 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerID="1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083" exitCode=0 Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.383644 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerID="b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd" exitCode=2 Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.383654 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerID="756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62" exitCode=0 Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.383698 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerDied","Data":"1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083"} Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.383721 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerDied","Data":"b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd"} Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.383734 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerDied","Data":"756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62"} Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.417181 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" podStartSLOduration=3.417160291 podStartE2EDuration="3.417160291s" podCreationTimestamp="2025-11-25 12:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:29:58.404168319 +0000 UTC m=+1318.322253720" watchObservedRunningTime="2025-11-25 12:29:58.417160291 +0000 UTC m=+1318.335245672" Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.442656 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:29:58 crc kubenswrapper[4693]: I1125 12:29:58.442849 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-log" containerID="cri-o://f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e" gracePeriod=30 Nov 25 12:29:58 crc 
kubenswrapper[4693]: I1125 12:29:58.442980 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-api" containerID="cri-o://1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142" gracePeriod=30 Nov 25 12:29:59 crc kubenswrapper[4693]: I1125 12:29:59.393608 4693 generic.go:334] "Generic (PLEG): container finished" podID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerID="f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e" exitCode=143 Nov 25 12:29:59 crc kubenswrapper[4693]: I1125 12:29:59.395843 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb0f1b87-9948-4fc0-b273-0aba59284c59","Type":"ContainerDied","Data":"f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e"} Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.142155 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb"] Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.169593 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb"] Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.169700 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.175689 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.189661 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.208189 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4qq\" (UniqueName: \"kubernetes.io/projected/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-kube-api-access-9h4qq\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.208473 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-config-volume\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.208605 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-secret-volume\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.310322 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4qq\" (UniqueName: \"kubernetes.io/projected/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-kube-api-access-9h4qq\") pod \"collect-profiles-29401230-jgcvb\" (UID: 
\"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.310383 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-config-volume\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.310436 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-secret-volume\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.311957 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-config-volume\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.320415 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-secret-volume\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.335202 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4qq\" (UniqueName: \"kubernetes.io/projected/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-kube-api-access-9h4qq\") pod \"collect-profiles-29401230-jgcvb\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.511949 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:00 crc kubenswrapper[4693]: I1125 12:30:00.954757 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb"] Nov 25 12:30:01 crc kubenswrapper[4693]: I1125 12:30:01.412952 4693 generic.go:334] "Generic (PLEG): container finished" podID="8714946b-3179-48cb-b0d4-9be5bbd4d3a5" containerID="90a3cbd8ef370cf4a178b6763cbbc8f14b25f6dd3c453e63b4357f7ec542f040" exitCode=0 Nov 25 12:30:01 crc kubenswrapper[4693]: I1125 12:30:01.413018 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" event={"ID":"8714946b-3179-48cb-b0d4-9be5bbd4d3a5","Type":"ContainerDied","Data":"90a3cbd8ef370cf4a178b6763cbbc8f14b25f6dd3c453e63b4357f7ec542f040"} Nov 25 12:30:01 crc kubenswrapper[4693]: I1125 12:30:01.413079 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" event={"ID":"8714946b-3179-48cb-b0d4-9be5bbd4d3a5","Type":"ContainerStarted","Data":"d167e57d8c3797baf89a586d638ae776fe941003bd986ca7a3793eb69013d535"} Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.142156 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.148035 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250310 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-config-data\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250416 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpsbf\" (UniqueName: \"kubernetes.io/projected/fb0f1b87-9948-4fc0-b273-0aba59284c59-kube-api-access-vpsbf\") pod \"fb0f1b87-9948-4fc0-b273-0aba59284c59\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250446 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-scripts\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250537 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-run-httpd\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250724 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-combined-ca-bundle\") pod \"fb0f1b87-9948-4fc0-b273-0aba59284c59\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250830 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fb0f1b87-9948-4fc0-b273-0aba59284c59-logs\") pod \"fb0f1b87-9948-4fc0-b273-0aba59284c59\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250860 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-log-httpd\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250885 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-ceilometer-tls-certs\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250933 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.250970 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-combined-ca-bundle\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.251001 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvd5t\" (UniqueName: \"kubernetes.io/projected/5ebe2ec7-b555-44a1-9865-51ac733d033d-kube-api-access-wvd5t\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.251062 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-config-data\") pod \"fb0f1b87-9948-4fc0-b273-0aba59284c59\" (UID: \"fb0f1b87-9948-4fc0-b273-0aba59284c59\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.251102 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb0f1b87-9948-4fc0-b273-0aba59284c59-logs" (OuterVolumeSpecName: "logs") pod "fb0f1b87-9948-4fc0-b273-0aba59284c59" (UID: "fb0f1b87-9948-4fc0-b273-0aba59284c59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.251128 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-sg-core-conf-yaml\") pod \"5ebe2ec7-b555-44a1-9865-51ac733d033d\" (UID: \"5ebe2ec7-b555-44a1-9865-51ac733d033d\") " Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.251797 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.251890 4693 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.251910 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb0f1b87-9948-4fc0-b273-0aba59284c59-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.257191 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0f1b87-9948-4fc0-b273-0aba59284c59-kube-api-access-vpsbf" (OuterVolumeSpecName: "kube-api-access-vpsbf") pod "fb0f1b87-9948-4fc0-b273-0aba59284c59" (UID: "fb0f1b87-9948-4fc0-b273-0aba59284c59"). InnerVolumeSpecName "kube-api-access-vpsbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.257667 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-scripts" (OuterVolumeSpecName: "scripts") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.267311 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebe2ec7-b555-44a1-9865-51ac733d033d-kube-api-access-wvd5t" (OuterVolumeSpecName: "kube-api-access-wvd5t") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "kube-api-access-wvd5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.296140 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-config-data" (OuterVolumeSpecName: "config-data") pod "fb0f1b87-9948-4fc0-b273-0aba59284c59" (UID: "fb0f1b87-9948-4fc0-b273-0aba59284c59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.330824 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb0f1b87-9948-4fc0-b273-0aba59284c59" (UID: "fb0f1b87-9948-4fc0-b273-0aba59284c59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.353759 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.353793 4693 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ebe2ec7-b555-44a1-9865-51ac733d033d-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.353806 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvd5t\" (UniqueName: \"kubernetes.io/projected/5ebe2ec7-b555-44a1-9865-51ac733d033d-kube-api-access-wvd5t\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.353818 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb0f1b87-9948-4fc0-b273-0aba59284c59-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.355145 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpsbf\" (UniqueName: \"kubernetes.io/projected/fb0f1b87-9948-4fc0-b273-0aba59284c59-kube-api-access-vpsbf\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.355172 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.358649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.380010 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.409216 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.412512 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-config-data" (OuterVolumeSpecName: "config-data") pod "5ebe2ec7-b555-44a1-9865-51ac733d033d" (UID: "5ebe2ec7-b555-44a1-9865-51ac733d033d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.430897 4693 generic.go:334] "Generic (PLEG): container finished" podID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerID="6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32" exitCode=0 Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.430968 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.430979 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerDied","Data":"6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32"} Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.431003 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ebe2ec7-b555-44a1-9865-51ac733d033d","Type":"ContainerDied","Data":"64e9ed35781dd1ac2e38fe728c40a2e5698fa6d95c274824f8943dcc3fbdea6d"} Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.431018 4693 scope.go:117] "RemoveContainer" containerID="1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.433328 4693 generic.go:334] "Generic (PLEG): container finished" podID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerID="1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142" exitCode=0 Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.433549 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.436350 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb0f1b87-9948-4fc0-b273-0aba59284c59","Type":"ContainerDied","Data":"1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142"} Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.436421 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb0f1b87-9948-4fc0-b273-0aba59284c59","Type":"ContainerDied","Data":"779a03248c1ff094e8381b79d9355a13cda47107d828408f69deb15af0294624"} Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.456931 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.456963 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.456975 4693 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.456983 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ebe2ec7-b555-44a1-9865-51ac733d033d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.510411 4693 scope.go:117] "RemoveContainer" containerID="b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd" Nov 25 12:30:02 crc 
kubenswrapper[4693]: I1125 12:30:02.520448 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.532456 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.543340 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.552578 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.572728 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.573130 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-api" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573148 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-api" Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.573159 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-notification-agent" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573165 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-notification-agent" Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.573178 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-log" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573183 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-log" Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.573198 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-central-agent" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573205 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-central-agent" Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.573221 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="proxy-httpd" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573226 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="proxy-httpd" Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.573237 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="sg-core" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573243 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="sg-core" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573446 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="sg-core" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573476 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-central-agent" Nov 25 12:30:02 crc kubenswrapper[4693]: 
I1125 12:30:02.573490 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-api" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573505 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="proxy-httpd" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573517 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" containerName="ceilometer-notification-agent" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.573528 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" containerName="nova-api-log" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.575583 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.577036 4693 scope.go:117] "RemoveContainer" containerID="6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.577431 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.577669 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.577807 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.599009 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.601568 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.601568 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.610554 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.613662 4693 scope.go:117] "RemoveContainer" containerID="756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.613974 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.614206 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.632656 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.658830 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.661749 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-scripts\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.661814 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-public-tls-certs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.661855 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662105 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662183 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-run-httpd\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662209 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-config-data\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662233 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662256 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmws\" (UniqueName: \"kubernetes.io/projected/31d92213-9991-494c-b585-8a535cf35371-kube-api-access-xxmws\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662272 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31d92213-9991-494c-b585-8a535cf35371-logs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662331 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662539 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhk4\" (UniqueName: \"kubernetes.io/projected/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-kube-api-access-dlhk4\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662589 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-config-data\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662619 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.662719 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-internal-tls-certs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.732635 4693 scope.go:117] "RemoveContainer" containerID="1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083"
Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.733723 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083\": container with ID starting with 1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083 not found: ID does not exist" containerID="1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.733765 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083"} err="failed to get container status \"1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083\": rpc error: code = NotFound desc = could not find container \"1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083\": container with ID starting with 1bcfc3f9d66a040f058f5f47a1c69b9b56b6fb322de60c6100fe8e05023ca083 not found: ID does not exist"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.733791 4693 scope.go:117] "RemoveContainer" containerID="b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd"
Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.737002 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd\": container with ID starting with b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd not found: ID does not exist" containerID="b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.737033 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd"} err="failed to get container status \"b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd\": rpc error: code = NotFound desc = could not find container \"b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd\": container with ID starting with b337a265105446a558a6a512b98a5b086b5baa900674a0df2b78d65ad76456cd not found: ID does not exist"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.737053 4693 scope.go:117] "RemoveContainer" containerID="6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32"
Nov 25 12:30:02 crc kubenswrapper[4693]: E1125 12:30:02.740006 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32\": container with ID starting with 6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32 not found: ID does not exist" containerID="6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32"
Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.740042 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32"} err="failed to get container status \"6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32\": rpc error: code = NotFound desc = could not find container \"6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32\": container with ID starting with 6610b85c50f34dc26a0b825b18b4b9b7f1af92262ef1ba8c1afed21ba718cd32 not found: ID does not exist"
containerID="756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.740515 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62"} err="failed to get container status \"756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62\": rpc error: code = NotFound desc = could not find container \"756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62\": container with ID starting with 756af77ae7fe680db6629a870c9b5eb31dcaa520d4a44103d94dbf5308102b62 not found: ID does not exist" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.740552 4693 scope.go:117] "RemoveContainer" containerID="1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.763895 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-scripts\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.763936 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-public-tls-certs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.763955 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764008 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764054 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-run-httpd\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764070 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-config-data\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764087 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764105 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmws\" (UniqueName: 
\"kubernetes.io/projected/31d92213-9991-494c-b585-8a535cf35371-kube-api-access-xxmws\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764118 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31d92213-9991-494c-b585-8a535cf35371-logs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764145 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764165 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhk4\" (UniqueName: \"kubernetes.io/projected/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-kube-api-access-dlhk4\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764190 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-config-data\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764211 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.764243 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-internal-tls-certs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.770498 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-log-httpd\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.770879 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31d92213-9991-494c-b585-8a535cf35371-logs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.773414 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.773422 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-internal-tls-certs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.774387 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-public-tls-certs\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.775710 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.777003 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.777153 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-config-data\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.778747 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-config-data\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.779283 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-scripts\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.779645 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.779820 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-run-httpd\") pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.784401 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmws\" (UniqueName: \"kubernetes.io/projected/31d92213-9991-494c-b585-8a535cf35371-kube-api-access-xxmws\") pod \"nova-api-0\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.794544 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhk4\" (UniqueName: \"kubernetes.io/projected/2e6a63b2-2650-4dc4-9a37-d7be65342b5d-kube-api-access-dlhk4\") 
pod \"ceilometer-0\" (UID: \"2e6a63b2-2650-4dc4-9a37-d7be65342b5d\") " pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.835247 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebe2ec7-b555-44a1-9865-51ac733d033d" path="/var/lib/kubelet/pods/5ebe2ec7-b555-44a1-9865-51ac733d033d/volumes" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.836937 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0f1b87-9948-4fc0-b273-0aba59284c59" path="/var/lib/kubelet/pods/fb0f1b87-9948-4fc0-b273-0aba59284c59/volumes" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.839686 4693 scope.go:117] "RemoveContainer" containerID="f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.903436 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.949487 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.954194 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:02 crc kubenswrapper[4693]: I1125 12:30:02.993420 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.068590 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-config-volume\") pod \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.068731 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-secret-volume\") pod \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.068892 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h4qq\" (UniqueName: \"kubernetes.io/projected/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-kube-api-access-9h4qq\") pod \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\" (UID: \"8714946b-3179-48cb-b0d4-9be5bbd4d3a5\") " Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.069344 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "8714946b-3179-48cb-b0d4-9be5bbd4d3a5" (UID: "8714946b-3179-48cb-b0d4-9be5bbd4d3a5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.070509 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.074488 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8714946b-3179-48cb-b0d4-9be5bbd4d3a5" (UID: "8714946b-3179-48cb-b0d4-9be5bbd4d3a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.078527 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.078742 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-kube-api-access-9h4qq" (OuterVolumeSpecName: "kube-api-access-9h4qq") pod "8714946b-3179-48cb-b0d4-9be5bbd4d3a5" (UID: "8714946b-3179-48cb-b0d4-9be5bbd4d3a5"). InnerVolumeSpecName "kube-api-access-9h4qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.089418 4693 scope.go:117] "RemoveContainer" containerID="1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142" Nov 25 12:30:03 crc kubenswrapper[4693]: E1125 12:30:03.090096 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142\": container with ID starting with 1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142 not found: ID does not exist" containerID="1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.090139 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142"} err="failed to get container status \"1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142\": rpc error: code = NotFound desc = could not find container \"1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142\": container with ID starting with 1801c3ea2f57848f5b31daff44b32d9bdd5c628aab61e0fb9de30a69eb34a142 not found: ID does not exist" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.090164 4693 scope.go:117] "RemoveContainer" containerID="f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e" Nov 25 12:30:03 crc kubenswrapper[4693]: E1125 12:30:03.091937 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e\": container with ID starting with f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e not found: ID does not exist" containerID="f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.092019 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e"} err="failed to get container status 
\"f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e\": rpc error: code = NotFound desc = could not find container \"f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e\": container with ID starting with f381d97147240fd873c3288cdfa114ace87d494848a54166e3990061a091079e not found: ID does not exist" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.171929 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h4qq\" (UniqueName: \"kubernetes.io/projected/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-kube-api-access-9h4qq\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.172262 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8714946b-3179-48cb-b0d4-9be5bbd4d3a5-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.383913 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.449439 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" event={"ID":"8714946b-3179-48cb-b0d4-9be5bbd4d3a5","Type":"ContainerDied","Data":"d167e57d8c3797baf89a586d638ae776fe941003bd986ca7a3793eb69013d535"} Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.449498 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d167e57d8c3797baf89a586d638ae776fe941003bd986ca7a3793eb69013d535" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.449575 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.455440 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e6a63b2-2650-4dc4-9a37-d7be65342b5d","Type":"ContainerStarted","Data":"1241c7ca69b6817d9835a2df609e99d99e32b35d9f5093dcdde24c4be63af145"} Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.474390 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.512836 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:03 crc kubenswrapper[4693]: W1125 12:30:03.525621 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d92213_9991_494c_b585_8a535cf35371.slice/crio-4479731cb147094874d7dd3eef0ecce1ae6abd30b2d0b00754eac9b906b444aa WatchSource:0}: Error finding container 4479731cb147094874d7dd3eef0ecce1ae6abd30b2d0b00754eac9b906b444aa: Status 404 returned error can't find the container with id 4479731cb147094874d7dd3eef0ecce1ae6abd30b2d0b00754eac9b906b444aa Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.668847 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gs7bm"] Nov 25 12:30:03 crc kubenswrapper[4693]: E1125 12:30:03.669403 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8714946b-3179-48cb-b0d4-9be5bbd4d3a5" containerName="collect-profiles" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.669427 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="8714946b-3179-48cb-b0d4-9be5bbd4d3a5" containerName="collect-profiles" Nov 25 12:30:03 crc 
kubenswrapper[4693]: I1125 12:30:03.669654 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="8714946b-3179-48cb-b0d4-9be5bbd4d3a5" containerName="collect-profiles" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.670453 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.672727 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.673121 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.681450 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gs7bm"] Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.781628 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.781856 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-config-data\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.781909 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbj7\" (UniqueName: \"kubernetes.io/projected/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-kube-api-access-7hbj7\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.781945 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-scripts\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.884708 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.884827 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-config-data\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.884851 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hbj7\" (UniqueName: 
\"kubernetes.io/projected/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-kube-api-access-7hbj7\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.884872 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-scripts\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.888757 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.889111 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-config-data\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.889338 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-scripts\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:03 crc kubenswrapper[4693]: I1125 12:30:03.904934 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hbj7\" (UniqueName: \"kubernetes.io/projected/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-kube-api-access-7hbj7\") pod \"nova-cell1-cell-mapping-gs7bm\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:04 crc kubenswrapper[4693]: I1125 12:30:04.004554 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:04 crc kubenswrapper[4693]: I1125 12:30:04.466412 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31d92213-9991-494c-b585-8a535cf35371","Type":"ContainerStarted","Data":"62e08ab6181b18c09300e8b8a5ad8c0f5395d2ab629a1f492a57be8b5d2cd8ab"} Nov 25 12:30:04 crc kubenswrapper[4693]: I1125 12:30:04.466767 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31d92213-9991-494c-b585-8a535cf35371","Type":"ContainerStarted","Data":"577dee7dabe45075ed15b7057dc625ddca8c848f4902cd08090a98e3dfee8a40"} Nov 25 12:30:04 crc kubenswrapper[4693]: I1125 12:30:04.466783 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31d92213-9991-494c-b585-8a535cf35371","Type":"ContainerStarted","Data":"4479731cb147094874d7dd3eef0ecce1ae6abd30b2d0b00754eac9b906b444aa"} Nov 25 12:30:04 crc kubenswrapper[4693]: I1125 12:30:04.469473 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e6a63b2-2650-4dc4-9a37-d7be65342b5d","Type":"ContainerStarted","Data":"fc479facf53a52aa9160a2b44dc01abb4a31aecfdf74f5de1a21b87665afe797"} Nov 25 12:30:04 crc kubenswrapper[4693]: I1125 12:30:04.485269 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gs7bm"] Nov 25 12:30:04 crc kubenswrapper[4693]: W1125 12:30:04.487736 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6259bc4_d44b_4b7e_8228_2cfa55f87da8.slice/crio-c308c7877a9c4b7ab432f4ec7ef3a25a811e7b7cec98dc2ada2b11701aeab1a0 WatchSource:0}: Error finding container c308c7877a9c4b7ab432f4ec7ef3a25a811e7b7cec98dc2ada2b11701aeab1a0: Status 404 returned error can't find the container with id c308c7877a9c4b7ab432f4ec7ef3a25a811e7b7cec98dc2ada2b11701aeab1a0 Nov 25 12:30:04 crc kubenswrapper[4693]: I1125 12:30:04.542919 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.542899935 podStartE2EDuration="2.542899935s" podCreationTimestamp="2025-11-25 12:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:30:04.48780157 +0000 UTC m=+1324.405886961" watchObservedRunningTime="2025-11-25 12:30:04.542899935 +0000 UTC m=+1324.460985306" Nov 25 12:30:05 crc kubenswrapper[4693]: I1125 12:30:05.480817 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gs7bm" event={"ID":"d6259bc4-d44b-4b7e-8228-2cfa55f87da8","Type":"ContainerStarted","Data":"fdcd0d5de05a34f8c953892764b755589f78335d9849c66857802595412bd4a2"} Nov 25 12:30:05 crc kubenswrapper[4693]: I1125 12:30:05.481163 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gs7bm" event={"ID":"d6259bc4-d44b-4b7e-8228-2cfa55f87da8","Type":"ContainerStarted","Data":"c308c7877a9c4b7ab432f4ec7ef3a25a811e7b7cec98dc2ada2b11701aeab1a0"} Nov 25 12:30:05 crc kubenswrapper[4693]: I1125 12:30:05.483943 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e6a63b2-2650-4dc4-9a37-d7be65342b5d","Type":"ContainerStarted","Data":"b6ba0c057499b8913206bf21f020c2420ee6c8bf19d566f356feed26abd74de9"} Nov 25 12:30:05 crc kubenswrapper[4693]: I1125 12:30:05.497112 4693 pod_startup_latency_tracker.go:104] 
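The "Observed pod startup duration" lines above are internally consistent: for nova-api-0, watchObservedRunningTime (12:30:04.542899935) minus podCreationTimestamp (12:30:02) is exactly the logged 2.542899935s, and for nova-cell1-cell-mapping-gs7bm, 12:30:05.497090264 minus 12:30:03 is 2.497090264s. With zero-valued pull timestamps (no image pulls) the SLO and E2E durations coincide; the m=+... suffixes appear to be Go's monotonic-clock reading in the time.Time string form. A worked check:

    // startuplatency.go - verifying the logged startup duration arithmetic.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	created, _ := time.Parse(time.RFC3339, "2025-11-25T12:30:02Z")
    	running, _ := time.Parse(time.RFC3339Nano, "2025-11-25T12:30:04.542899935Z")
    	fmt.Println(running.Sub(created)) // 2.542899935s, as logged for nova-api-0
    }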
"Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gs7bm" podStartSLOduration=2.497090264 podStartE2EDuration="2.497090264s" podCreationTimestamp="2025-11-25 12:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:30:05.495721046 +0000 UTC m=+1325.413806437" watchObservedRunningTime="2025-11-25 12:30:05.497090264 +0000 UTC m=+1325.415175645" Nov 25 12:30:05 crc kubenswrapper[4693]: I1125 12:30:05.980267 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.053940 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64dbf5859c-4hs4t"] Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.054175 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerName="dnsmasq-dns" containerID="cri-o://6aa0e19f9203e8e6ad67c47c41e5b12c1bea14ca0b0e7d77efbd711698dab350" gracePeriod=10 Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.509730 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e6a63b2-2650-4dc4-9a37-d7be65342b5d","Type":"ContainerStarted","Data":"ccd2129514ec7bae5bddc1cb8d55e43b26d15dd51e90011bc5001afee6a7b392"} Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.512222 4693 generic.go:334] "Generic (PLEG): container finished" podID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerID="6aa0e19f9203e8e6ad67c47c41e5b12c1bea14ca0b0e7d77efbd711698dab350" exitCode=0 Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.513659 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" event={"ID":"c5ad2242-431b-4d4e-a815-3623305d8b38","Type":"ContainerDied","Data":"6aa0e19f9203e8e6ad67c47c41e5b12c1bea14ca0b0e7d77efbd711698dab350"} Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.624204 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.740893 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-swift-storage-0\") pod \"c5ad2242-431b-4d4e-a815-3623305d8b38\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.740979 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-sb\") pod \"c5ad2242-431b-4d4e-a815-3623305d8b38\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.741048 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-nb\") pod \"c5ad2242-431b-4d4e-a815-3623305d8b38\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.741116 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-svc\") pod \"c5ad2242-431b-4d4e-a815-3623305d8b38\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.741844 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqx48\" (UniqueName: \"kubernetes.io/projected/c5ad2242-431b-4d4e-a815-3623305d8b38-kube-api-access-qqx48\") pod \"c5ad2242-431b-4d4e-a815-3623305d8b38\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.741978 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-config\") pod \"c5ad2242-431b-4d4e-a815-3623305d8b38\" (UID: \"c5ad2242-431b-4d4e-a815-3623305d8b38\") " Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.754694 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ad2242-431b-4d4e-a815-3623305d8b38-kube-api-access-qqx48" (OuterVolumeSpecName: "kube-api-access-qqx48") pod "c5ad2242-431b-4d4e-a815-3623305d8b38" (UID: "c5ad2242-431b-4d4e-a815-3623305d8b38"). InnerVolumeSpecName "kube-api-access-qqx48". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.798178 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5ad2242-431b-4d4e-a815-3623305d8b38" (UID: "c5ad2242-431b-4d4e-a815-3623305d8b38"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.799863 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5ad2242-431b-4d4e-a815-3623305d8b38" (UID: "c5ad2242-431b-4d4e-a815-3623305d8b38"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.802959 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c5ad2242-431b-4d4e-a815-3623305d8b38" (UID: "c5ad2242-431b-4d4e-a815-3623305d8b38"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.814938 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5ad2242-431b-4d4e-a815-3623305d8b38" (UID: "c5ad2242-431b-4d4e-a815-3623305d8b38"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.816865 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-config" (OuterVolumeSpecName: "config") pod "c5ad2242-431b-4d4e-a815-3623305d8b38" (UID: "c5ad2242-431b-4d4e-a815-3623305d8b38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.844860 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.844907 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.844919 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.844931 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.844941 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqx48\" (UniqueName: \"kubernetes.io/projected/c5ad2242-431b-4d4e-a815-3623305d8b38-kube-api-access-qqx48\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:06 crc kubenswrapper[4693]: I1125 12:30:06.844953 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5ad2242-431b-4d4e-a815-3623305d8b38-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.525030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2e6a63b2-2650-4dc4-9a37-d7be65342b5d","Type":"ContainerStarted","Data":"f808aaccbef2b0a0a03ebfec8dfd96665f2028d1999f7b8c8d93a26253d74afd"} Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.525401 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.528560 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" event={"ID":"c5ad2242-431b-4d4e-a815-3623305d8b38","Type":"ContainerDied","Data":"8524ccdf01b8a0729334854aad167cf93d4a598c1750cba3463173fc7e169434"} Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.528598 4693 scope.go:117] "RemoveContainer" containerID="6aa0e19f9203e8e6ad67c47c41e5b12c1bea14ca0b0e7d77efbd711698dab350" Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.528706 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.561716 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.672400309 podStartE2EDuration="5.561675323s" podCreationTimestamp="2025-11-25 12:30:02 +0000 UTC" firstStartedPulling="2025-11-25 12:30:03.385938245 +0000 UTC m=+1323.304023626" lastFinishedPulling="2025-11-25 12:30:07.275213249 +0000 UTC m=+1327.193298640" observedRunningTime="2025-11-25 12:30:07.543196819 +0000 UTC m=+1327.461282210" watchObservedRunningTime="2025-11-25 12:30:07.561675323 +0000 UTC m=+1327.479760714" Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.572914 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64dbf5859c-4hs4t"] Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.573323 4693 scope.go:117] "RemoveContainer" containerID="a9032ae2a779d2ac3d3906ec30f878d49f24bf7839634ffab00d9bfc0501927c" Nov 25 12:30:07 crc kubenswrapper[4693]: I1125 12:30:07.582044 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64dbf5859c-4hs4t"] Nov 25 12:30:08 crc kubenswrapper[4693]: I1125 12:30:08.825172 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" path="/var/lib/kubelet/pods/c5ad2242-431b-4d4e-a815-3623305d8b38/volumes" Nov 25 12:30:10 crc kubenswrapper[4693]: I1125 12:30:10.557707 4693 generic.go:334] "Generic (PLEG): container finished" podID="d6259bc4-d44b-4b7e-8228-2cfa55f87da8" containerID="fdcd0d5de05a34f8c953892764b755589f78335d9849c66857802595412bd4a2" exitCode=0 Nov 25 12:30:10 crc kubenswrapper[4693]: I1125 12:30:10.557751 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gs7bm" event={"ID":"d6259bc4-d44b-4b7e-8228-2cfa55f87da8","Type":"ContainerDied","Data":"fdcd0d5de05a34f8c953892764b755589f78335d9849c66857802595412bd4a2"} Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.541817 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-64dbf5859c-4hs4t" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.191:5353: i/o timeout" Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.903905 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.941677 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hbj7\" (UniqueName: \"kubernetes.io/projected/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-kube-api-access-7hbj7\") pod \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.942444 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-config-data\") pod \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.942491 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-combined-ca-bundle\") pod \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.942662 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-scripts\") pod \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\" (UID: \"d6259bc4-d44b-4b7e-8228-2cfa55f87da8\") " Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.948133 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-kube-api-access-7hbj7" (OuterVolumeSpecName: "kube-api-access-7hbj7") pod "d6259bc4-d44b-4b7e-8228-2cfa55f87da8" (UID: "d6259bc4-d44b-4b7e-8228-2cfa55f87da8"). InnerVolumeSpecName "kube-api-access-7hbj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.948872 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-scripts" (OuterVolumeSpecName: "scripts") pod "d6259bc4-d44b-4b7e-8228-2cfa55f87da8" (UID: "d6259bc4-d44b-4b7e-8228-2cfa55f87da8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.973054 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-config-data" (OuterVolumeSpecName: "config-data") pod "d6259bc4-d44b-4b7e-8228-2cfa55f87da8" (UID: "d6259bc4-d44b-4b7e-8228-2cfa55f87da8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:11 crc kubenswrapper[4693]: I1125 12:30:11.987583 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6259bc4-d44b-4b7e-8228-2cfa55f87da8" (UID: "d6259bc4-d44b-4b7e-8228-2cfa55f87da8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.044811 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.044857 4693 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-scripts\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.044870 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hbj7\" (UniqueName: \"kubernetes.io/projected/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-kube-api-access-7hbj7\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.044884 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6259bc4-d44b-4b7e-8228-2cfa55f87da8-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.576199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gs7bm" event={"ID":"d6259bc4-d44b-4b7e-8228-2cfa55f87da8","Type":"ContainerDied","Data":"c308c7877a9c4b7ab432f4ec7ef3a25a811e7b7cec98dc2ada2b11701aeab1a0"} Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.576244 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c308c7877a9c4b7ab432f4ec7ef3a25a811e7b7cec98dc2ada2b11701aeab1a0" Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.576273 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gs7bm" Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.759724 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.761571 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-log" containerID="cri-o://577dee7dabe45075ed15b7057dc625ddca8c848f4902cd08090a98e3dfee8a40" gracePeriod=30 Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.761776 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-api" containerID="cri-o://62e08ab6181b18c09300e8b8a5ad8c0f5395d2ab629a1f492a57be8b5d2cd8ab" gracePeriod=30 Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.773472 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.773680 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="721e10a7-c21e-449c-8186-aa83c1d7f97d" containerName="nova-scheduler-scheduler" containerID="cri-o://a75608c79f42b61590a27290652e89b196514588a4e403c6cb702feab8766c79" gracePeriod=30 Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.799033 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.799311 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" 
containerName="nova-metadata-log" containerID="cri-o://10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e" gracePeriod=30 Nov 25 12:30:12 crc kubenswrapper[4693]: I1125 12:30:12.799528 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-metadata" containerID="cri-o://c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0" gracePeriod=30 Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.619044 4693 generic.go:334] "Generic (PLEG): container finished" podID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerID="10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e" exitCode=143 Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.619161 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4","Type":"ContainerDied","Data":"10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e"} Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.642072 4693 generic.go:334] "Generic (PLEG): container finished" podID="31d92213-9991-494c-b585-8a535cf35371" containerID="62e08ab6181b18c09300e8b8a5ad8c0f5395d2ab629a1f492a57be8b5d2cd8ab" exitCode=0 Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.642106 4693 generic.go:334] "Generic (PLEG): container finished" podID="31d92213-9991-494c-b585-8a535cf35371" containerID="577dee7dabe45075ed15b7057dc625ddca8c848f4902cd08090a98e3dfee8a40" exitCode=143 Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.642141 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31d92213-9991-494c-b585-8a535cf35371","Type":"ContainerDied","Data":"62e08ab6181b18c09300e8b8a5ad8c0f5395d2ab629a1f492a57be8b5d2cd8ab"} Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.642185 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31d92213-9991-494c-b585-8a535cf35371","Type":"ContainerDied","Data":"577dee7dabe45075ed15b7057dc625ddca8c848f4902cd08090a98e3dfee8a40"} Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.917013 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.977316 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-config-data\") pod \"31d92213-9991-494c-b585-8a535cf35371\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.977831 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31d92213-9991-494c-b585-8a535cf35371-logs\") pod \"31d92213-9991-494c-b585-8a535cf35371\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.977857 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-combined-ca-bundle\") pod \"31d92213-9991-494c-b585-8a535cf35371\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.977899 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-public-tls-certs\") pod \"31d92213-9991-494c-b585-8a535cf35371\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.977959 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxmws\" (UniqueName: \"kubernetes.io/projected/31d92213-9991-494c-b585-8a535cf35371-kube-api-access-xxmws\") pod \"31d92213-9991-494c-b585-8a535cf35371\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.977989 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-internal-tls-certs\") pod \"31d92213-9991-494c-b585-8a535cf35371\" (UID: \"31d92213-9991-494c-b585-8a535cf35371\") " Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.978390 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31d92213-9991-494c-b585-8a535cf35371-logs" (OuterVolumeSpecName: "logs") pod "31d92213-9991-494c-b585-8a535cf35371" (UID: "31d92213-9991-494c-b585-8a535cf35371"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.978603 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31d92213-9991-494c-b585-8a535cf35371-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:13 crc kubenswrapper[4693]: I1125 12:30:13.985047 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d92213-9991-494c-b585-8a535cf35371-kube-api-access-xxmws" (OuterVolumeSpecName: "kube-api-access-xxmws") pod "31d92213-9991-494c-b585-8a535cf35371" (UID: "31d92213-9991-494c-b585-8a535cf35371"). InnerVolumeSpecName "kube-api-access-xxmws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.006873 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31d92213-9991-494c-b585-8a535cf35371" (UID: "31d92213-9991-494c-b585-8a535cf35371"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.021099 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-config-data" (OuterVolumeSpecName: "config-data") pod "31d92213-9991-494c-b585-8a535cf35371" (UID: "31d92213-9991-494c-b585-8a535cf35371"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.038742 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "31d92213-9991-494c-b585-8a535cf35371" (UID: "31d92213-9991-494c-b585-8a535cf35371"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.061039 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "31d92213-9991-494c-b585-8a535cf35371" (UID: "31d92213-9991-494c-b585-8a535cf35371"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.080763 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxmws\" (UniqueName: \"kubernetes.io/projected/31d92213-9991-494c-b585-8a535cf35371-kube-api-access-xxmws\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.080806 4693 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.080816 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.080826 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.080834 4693 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31d92213-9991-494c-b585-8a535cf35371-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.653284 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"31d92213-9991-494c-b585-8a535cf35371","Type":"ContainerDied","Data":"4479731cb147094874d7dd3eef0ecce1ae6abd30b2d0b00754eac9b906b444aa"} Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.653334 4693 scope.go:117] 
"RemoveContainer" containerID="62e08ab6181b18c09300e8b8a5ad8c0f5395d2ab629a1f492a57be8b5d2cd8ab" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.653463 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.659835 4693 generic.go:334] "Generic (PLEG): container finished" podID="721e10a7-c21e-449c-8186-aa83c1d7f97d" containerID="a75608c79f42b61590a27290652e89b196514588a4e403c6cb702feab8766c79" exitCode=0 Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.659877 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"721e10a7-c21e-449c-8186-aa83c1d7f97d","Type":"ContainerDied","Data":"a75608c79f42b61590a27290652e89b196514588a4e403c6cb702feab8766c79"} Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.689234 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.689452 4693 scope.go:117] "RemoveContainer" containerID="577dee7dabe45075ed15b7057dc625ddca8c848f4902cd08090a98e3dfee8a40" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.703476 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.715428 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:14 crc kubenswrapper[4693]: E1125 12:30:14.715965 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6259bc4-d44b-4b7e-8228-2cfa55f87da8" containerName="nova-manage" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.715983 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6259bc4-d44b-4b7e-8228-2cfa55f87da8" containerName="nova-manage" Nov 25 12:30:14 crc kubenswrapper[4693]: E1125 12:30:14.716002 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerName="init" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716009 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerName="init" Nov 25 12:30:14 crc kubenswrapper[4693]: E1125 12:30:14.716024 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-log" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716031 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-log" Nov 25 12:30:14 crc kubenswrapper[4693]: E1125 12:30:14.716046 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-api" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716053 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-api" Nov 25 12:30:14 crc kubenswrapper[4693]: E1125 12:30:14.716090 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerName="dnsmasq-dns" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716097 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerName="dnsmasq-dns" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716311 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6259bc4-d44b-4b7e-8228-2cfa55f87da8" 
containerName="nova-manage" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716326 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ad2242-431b-4d4e-a815-3623305d8b38" containerName="dnsmasq-dns" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716342 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-api" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.716362 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d92213-9991-494c-b585-8a535cf35371" containerName="nova-api-log" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.717544 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.720428 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.720758 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.720865 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.736982 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.801897 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-public-tls-certs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.802117 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-internal-tls-certs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.802149 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.802201 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-config-data\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.802245 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37605f96-b59e-45ad-b177-dad562d6af05-logs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.802302 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns6wb\" (UniqueName: 
\"kubernetes.io/projected/37605f96-b59e-45ad-b177-dad562d6af05-kube-api-access-ns6wb\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.831057 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d92213-9991-494c-b585-8a535cf35371" path="/var/lib/kubelet/pods/31d92213-9991-494c-b585-8a535cf35371/volumes" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.904108 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-config-data\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.904234 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37605f96-b59e-45ad-b177-dad562d6af05-logs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.904351 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns6wb\" (UniqueName: \"kubernetes.io/projected/37605f96-b59e-45ad-b177-dad562d6af05-kube-api-access-ns6wb\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.904437 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-public-tls-certs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.904466 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-internal-tls-certs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.904502 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.905098 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37605f96-b59e-45ad-b177-dad562d6af05-logs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.910363 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-config-data\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.910450 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-internal-tls-certs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " 
pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.911725 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.912300 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37605f96-b59e-45ad-b177-dad562d6af05-public-tls-certs\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.922193 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns6wb\" (UniqueName: \"kubernetes.io/projected/37605f96-b59e-45ad-b177-dad562d6af05-kube-api-access-ns6wb\") pod \"nova-api-0\" (UID: \"37605f96-b59e-45ad-b177-dad562d6af05\") " pod="openstack/nova-api-0" Nov 25 12:30:14 crc kubenswrapper[4693]: I1125 12:30:14.990241 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.036472 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.111393 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx44q\" (UniqueName: \"kubernetes.io/projected/721e10a7-c21e-449c-8186-aa83c1d7f97d-kube-api-access-bx44q\") pod \"721e10a7-c21e-449c-8186-aa83c1d7f97d\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.111502 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-config-data\") pod \"721e10a7-c21e-449c-8186-aa83c1d7f97d\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.111631 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-combined-ca-bundle\") pod \"721e10a7-c21e-449c-8186-aa83c1d7f97d\" (UID: \"721e10a7-c21e-449c-8186-aa83c1d7f97d\") " Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.117424 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721e10a7-c21e-449c-8186-aa83c1d7f97d-kube-api-access-bx44q" (OuterVolumeSpecName: "kube-api-access-bx44q") pod "721e10a7-c21e-449c-8186-aa83c1d7f97d" (UID: "721e10a7-c21e-449c-8186-aa83c1d7f97d"). InnerVolumeSpecName "kube-api-access-bx44q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.142058 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "721e10a7-c21e-449c-8186-aa83c1d7f97d" (UID: "721e10a7-c21e-449c-8186-aa83c1d7f97d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.152405 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-config-data" (OuterVolumeSpecName: "config-data") pod "721e10a7-c21e-449c-8186-aa83c1d7f97d" (UID: "721e10a7-c21e-449c-8186-aa83c1d7f97d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.214173 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.214223 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx44q\" (UniqueName: \"kubernetes.io/projected/721e10a7-c21e-449c-8186-aa83c1d7f97d-kube-api-access-bx44q\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.214237 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/721e10a7-c21e-449c-8186-aa83c1d7f97d-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.492296 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 25 12:30:15 crc kubenswrapper[4693]: W1125 12:30:15.495904 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37605f96_b59e_45ad_b177_dad562d6af05.slice/crio-38d32cbebabc7f9a0913fa5603bf59038cf9b8836d3e1601cabdf4ea893ed1c6 WatchSource:0}: Error finding container 38d32cbebabc7f9a0913fa5603bf59038cf9b8836d3e1601cabdf4ea893ed1c6: Status 404 returned error can't find the container with id 38d32cbebabc7f9a0913fa5603bf59038cf9b8836d3e1601cabdf4ea893ed1c6 Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.668970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37605f96-b59e-45ad-b177-dad562d6af05","Type":"ContainerStarted","Data":"38d32cbebabc7f9a0913fa5603bf59038cf9b8836d3e1601cabdf4ea893ed1c6"} Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.671433 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"721e10a7-c21e-449c-8186-aa83c1d7f97d","Type":"ContainerDied","Data":"ed782d2a70dacf1f9424b80b322394f74869696586c927c95dc2654d4be801af"} Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.671465 4693 scope.go:117] "RemoveContainer" containerID="a75608c79f42b61590a27290652e89b196514588a4e403c6cb702feab8766c79" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.671610 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.709325 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.718761 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.734127 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:30:15 crc kubenswrapper[4693]: E1125 12:30:15.734543 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="721e10a7-c21e-449c-8186-aa83c1d7f97d" containerName="nova-scheduler-scheduler" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.734556 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="721e10a7-c21e-449c-8186-aa83c1d7f97d" containerName="nova-scheduler-scheduler" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.734766 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="721e10a7-c21e-449c-8186-aa83c1d7f97d" containerName="nova-scheduler-scheduler" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.735435 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.739963 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.745316 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.836515 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcl48\" (UniqueName: \"kubernetes.io/projected/4eca8fd3-dd93-493a-9278-1749de83eae1-kube-api-access-fcl48\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.836955 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eca8fd3-dd93-493a-9278-1749de83eae1-config-data\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.836984 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca8fd3-dd93-493a-9278-1749de83eae1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.934217 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:48572->10.217.0.194:8775: read: connection reset by peer" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.934686 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.194:8775/\": read tcp 10.217.0.2:48570->10.217.0.194:8775: read: connection reset by peer" 
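The prober.go:107 records just above show both nova-metadata-0 containers failing their readiness probes with connection resets while the old pod is torn down; the same pattern repeats below for the startup probes of the replacement nova-api-0 and nova-metadata-0 pods. A minimal Go sketch for tallying such records offline follows. It is not part of kubelet: it assumes journalctl-style input (one record per line) on stdin, and it copies the field order visible in these lines (probeType, pod, podUID, containerName); the file name probetally.go and the output format are illustrative choices.

// probetally.go: count kubelet "Probe failed" records per probe type and
// container, reading a kubelet journal excerpt on stdin.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

// Mirrors the prober.go:107 wording exactly as klog emits it in this log.
var probeRe = regexp.MustCompile(
	`"Probe failed" probeType="([^"]+)" pod="([^"]+)" podUID="[^"]+" containerName="([^"]+)"`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal records can be long
	for sc.Scan() {
		// FindAll tolerates input where several records share one line.
		for _, m := range probeRe.FindAllStringSubmatch(sc.Text(), -1) {
			counts[fmt.Sprintf("%-9s %s/%s", m[1], m[2], m[3])]++
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}
	keys := make([]string, 0, len(counts))
	for k := range counts {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	for _, k := range keys {
		fmt.Printf("%4d  %s\n", counts[k], k)
	}
}

Fed this excerpt on stdin (for example via journalctl -u kubelet --no-pager), it prints one count per probe type and container, which makes the transient Readiness/Startup failures around each pod replacement easy to separate from persistent ones; the rendering is this sketch's own, not kubelet output.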
Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.938497 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eca8fd3-dd93-493a-9278-1749de83eae1-config-data\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.938555 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca8fd3-dd93-493a-9278-1749de83eae1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.938719 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcl48\" (UniqueName: \"kubernetes.io/projected/4eca8fd3-dd93-493a-9278-1749de83eae1-kube-api-access-fcl48\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.952365 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca8fd3-dd93-493a-9278-1749de83eae1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.955952 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4eca8fd3-dd93-493a-9278-1749de83eae1-config-data\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:15 crc kubenswrapper[4693]: I1125 12:30:15.958904 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcl48\" (UniqueName: \"kubernetes.io/projected/4eca8fd3-dd93-493a-9278-1749de83eae1-kube-api-access-fcl48\") pod \"nova-scheduler-0\" (UID: \"4eca8fd3-dd93-493a-9278-1749de83eae1\") " pod="openstack/nova-scheduler-0" Nov 25 12:30:16 crc kubenswrapper[4693]: I1125 12:30:16.062063 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.494854 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.510710 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 25 12:30:18 crc kubenswrapper[4693]: W1125 12:30:16.515902 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eca8fd3_dd93_493a_9278_1749de83eae1.slice/crio-76e67e833ef88862b82b9b0383ceada5d547308c1b1c0f5ce8ba67f36d71f9f7 WatchSource:0}: Error finding container 76e67e833ef88862b82b9b0383ceada5d547308c1b1c0f5ce8ba67f36d71f9f7: Status 404 returned error can't find the container with id 76e67e833ef88862b82b9b0383ceada5d547308c1b1c0f5ce8ba67f36d71f9f7 Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.552502 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-config-data\") pod \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.552569 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-combined-ca-bundle\") pod \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.552643 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhg87\" (UniqueName: \"kubernetes.io/projected/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-kube-api-access-bhg87\") pod \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.552679 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-nova-metadata-tls-certs\") pod \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.552788 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-logs\") pod \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\" (UID: \"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4\") " Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.553913 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-logs" (OuterVolumeSpecName: "logs") pod "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" (UID: "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.557300 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-kube-api-access-bhg87" (OuterVolumeSpecName: "kube-api-access-bhg87") pod "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" (UID: "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4"). InnerVolumeSpecName "kube-api-access-bhg87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.584531 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" (UID: "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.589598 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-config-data" (OuterVolumeSpecName: "config-data") pod "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" (UID: "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.615952 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" (UID: "4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.655628 4693 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.655655 4693 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-logs\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.655665 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.655673 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.655682 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhg87\" (UniqueName: \"kubernetes.io/projected/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4-kube-api-access-bhg87\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.686349 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4eca8fd3-dd93-493a-9278-1749de83eae1","Type":"ContainerStarted","Data":"76e67e833ef88862b82b9b0383ceada5d547308c1b1c0f5ce8ba67f36d71f9f7"} Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.688206 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"37605f96-b59e-45ad-b177-dad562d6af05","Type":"ContainerStarted","Data":"c619be0aa2fc3ba128d6b5839aa350ae764be242b3bb786c015974a33f47d6f6"} Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.688227 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"37605f96-b59e-45ad-b177-dad562d6af05","Type":"ContainerStarted","Data":"34407df7d99074379c2704d8e222781163af39ae357bebe2520f7dc72bf47a2a"} Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.697681 4693 generic.go:334] "Generic (PLEG): container finished" podID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerID="c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0" exitCode=0 Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.697722 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4","Type":"ContainerDied","Data":"c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0"} Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.697745 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.697761 4693 scope.go:117] "RemoveContainer" containerID="c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.697751 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4","Type":"ContainerDied","Data":"84c1600fd2120debb0726f9c819b83221a911beaed30c0858417e0b9ddc1725f"} Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.732600 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7325732499999997 podStartE2EDuration="2.73257325s" podCreationTimestamp="2025-11-25 12:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:30:16.712403418 +0000 UTC m=+1336.630488799" watchObservedRunningTime="2025-11-25 12:30:16.73257325 +0000 UTC m=+1336.650658631" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.743636 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.744015 4693 scope.go:117] "RemoveContainer" containerID="10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.765870 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.772210 4693 scope.go:117] "RemoveContainer" containerID="c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0" Nov 25 12:30:18 crc kubenswrapper[4693]: E1125 12:30:16.772653 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0\": container with ID starting with c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0 not found: ID does not exist" containerID="c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.772696 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0"} err="failed to get container status \"c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0\": rpc error: code = NotFound desc = could not find container \"c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0\": container with ID starting with 
c80d3d4b6c7a68a7bf61760e72b53088f6997754442eb10daf71e44dcbbedfa0 not found: ID does not exist" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.772726 4693 scope.go:117] "RemoveContainer" containerID="10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e" Nov 25 12:30:18 crc kubenswrapper[4693]: E1125 12:30:16.773306 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e\": container with ID starting with 10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e not found: ID does not exist" containerID="10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.773349 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e"} err="failed to get container status \"10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e\": rpc error: code = NotFound desc = could not find container \"10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e\": container with ID starting with 10896e0871182a9c265d25b6140898f29901770ce963d31d3a203845b0c1ab5e not found: ID does not exist" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.776780 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:30:18 crc kubenswrapper[4693]: E1125 12:30:16.777276 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-log" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.777293 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-log" Nov 25 12:30:18 crc kubenswrapper[4693]: E1125 12:30:16.777311 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-metadata" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.777321 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-metadata" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.777568 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-log" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.777601 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" containerName="nova-metadata-metadata" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.778794 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.780826 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.781053 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.792921 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.824235 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4" path="/var/lib/kubelet/pods/4fe8afed-c8ef-437c-99d0-ecbb1a9e1fa4/volumes" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.824937 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="721e10a7-c21e-449c-8186-aa83c1d7f97d" path="/var/lib/kubelet/pods/721e10a7-c21e-449c-8186-aa83c1d7f97d/volumes" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.859167 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-config-data\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.859277 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/599f11ff-5079-4815-be45-5ffac410eb82-logs\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.859317 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7vmk\" (UniqueName: \"kubernetes.io/projected/599f11ff-5079-4815-be45-5ffac410eb82-kube-api-access-g7vmk\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.859419 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.859489 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.961306 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.961384 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-config-data\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.961451 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/599f11ff-5079-4815-be45-5ffac410eb82-logs\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.961476 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7vmk\" (UniqueName: \"kubernetes.io/projected/599f11ff-5079-4815-be45-5ffac410eb82-kube-api-access-g7vmk\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.961517 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.962507 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/599f11ff-5079-4815-be45-5ffac410eb82-logs\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.966333 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.966747 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.967444 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/599f11ff-5079-4815-be45-5ffac410eb82-config-data\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:16.979789 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7vmk\" (UniqueName: \"kubernetes.io/projected/599f11ff-5079-4815-be45-5ffac410eb82-kube-api-access-g7vmk\") pod \"nova-metadata-0\" (UID: \"599f11ff-5079-4815-be45-5ffac410eb82\") " pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:17.097325 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:17.719151 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4eca8fd3-dd93-493a-9278-1749de83eae1","Type":"ContainerStarted","Data":"0a668d2fcdcfb9aca340c89d33a116855f1e66a1bef70d1b423f4cbb6aa16707"} Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:17.743701 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.743678785 podStartE2EDuration="2.743678785s" podCreationTimestamp="2025-11-25 12:30:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:30:17.737546383 +0000 UTC m=+1337.655631774" watchObservedRunningTime="2025-11-25 12:30:17.743678785 +0000 UTC m=+1337.661764166" Nov 25 12:30:18 crc kubenswrapper[4693]: I1125 12:30:18.836813 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 25 12:30:18 crc kubenswrapper[4693]: W1125 12:30:18.837589 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod599f11ff_5079_4815_be45_5ffac410eb82.slice/crio-962f8b6c8a0189bd753b6528860bbd597ec928184aef37cfdcf074728af3e88d WatchSource:0}: Error finding container 962f8b6c8a0189bd753b6528860bbd597ec928184aef37cfdcf074728af3e88d: Status 404 returned error can't find the container with id 962f8b6c8a0189bd753b6528860bbd597ec928184aef37cfdcf074728af3e88d Nov 25 12:30:19 crc kubenswrapper[4693]: I1125 12:30:19.735863 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"599f11ff-5079-4815-be45-5ffac410eb82","Type":"ContainerStarted","Data":"3a2d9e62972641f7c8b3ca2abf0a2332c2b598d963f6a070ff1451243988e3af"} Nov 25 12:30:19 crc kubenswrapper[4693]: I1125 12:30:19.736107 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"599f11ff-5079-4815-be45-5ffac410eb82","Type":"ContainerStarted","Data":"447ab757de855c5dafd11db928ab6503e0cc3227a5c517a629d1e838e2cee9d0"} Nov 25 12:30:19 crc kubenswrapper[4693]: I1125 12:30:19.736118 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"599f11ff-5079-4815-be45-5ffac410eb82","Type":"ContainerStarted","Data":"962f8b6c8a0189bd753b6528860bbd597ec928184aef37cfdcf074728af3e88d"} Nov 25 12:30:19 crc kubenswrapper[4693]: I1125 12:30:19.763622 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.763604419 podStartE2EDuration="3.763604419s" podCreationTimestamp="2025-11-25 12:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:30:19.756519802 +0000 UTC m=+1339.674605183" watchObservedRunningTime="2025-11-25 12:30:19.763604419 +0000 UTC m=+1339.681689800" Nov 25 12:30:21 crc kubenswrapper[4693]: I1125 12:30:21.062614 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 25 12:30:22 crc kubenswrapper[4693]: I1125 12:30:22.098217 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 25 12:30:22 crc kubenswrapper[4693]: I1125 12:30:22.098577 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Nov 25 12:30:25 crc kubenswrapper[4693]: I1125 12:30:25.037325 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 12:30:25 crc kubenswrapper[4693]: I1125 12:30:25.037661 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 25 12:30:26 crc kubenswrapper[4693]: I1125 12:30:26.051584 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37605f96-b59e-45ad-b177-dad562d6af05" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 12:30:26 crc kubenswrapper[4693]: I1125 12:30:26.051597 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="37605f96-b59e-45ad-b177-dad562d6af05" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 12:30:26 crc kubenswrapper[4693]: I1125 12:30:26.062434 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 25 12:30:26 crc kubenswrapper[4693]: I1125 12:30:26.087427 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 25 12:30:26 crc kubenswrapper[4693]: I1125 12:30:26.849746 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 25 12:30:27 crc kubenswrapper[4693]: I1125 12:30:27.098247 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 12:30:27 crc kubenswrapper[4693]: I1125 12:30:27.098301 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 25 12:30:28 crc kubenswrapper[4693]: I1125 12:30:28.115622 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="599f11ff-5079-4815-be45-5ffac410eb82" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 12:30:28 crc kubenswrapper[4693]: I1125 12:30:28.115633 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="599f11ff-5079-4815-be45-5ffac410eb82" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 25 12:30:32 crc kubenswrapper[4693]: I1125 12:30:32.913181 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 25 12:30:35 crc kubenswrapper[4693]: I1125 12:30:35.044148 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 12:30:35 crc kubenswrapper[4693]: I1125 12:30:35.046045 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 12:30:35 crc kubenswrapper[4693]: I1125 12:30:35.051671 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 25 12:30:35 crc kubenswrapper[4693]: I1125 12:30:35.053978 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 12:30:35 crc kubenswrapper[4693]: 
I1125 12:30:35.119308 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:30:35 crc kubenswrapper[4693]: I1125 12:30:35.119461 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:30:35 crc kubenswrapper[4693]: I1125 12:30:35.921896 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 25 12:30:35 crc kubenswrapper[4693]: I1125 12:30:35.927851 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 25 12:30:37 crc kubenswrapper[4693]: I1125 12:30:37.102227 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 12:30:37 crc kubenswrapper[4693]: I1125 12:30:37.105832 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 25 12:30:37 crc kubenswrapper[4693]: I1125 12:30:37.107743 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 12:30:37 crc kubenswrapper[4693]: I1125 12:30:37.978115 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 25 12:30:46 crc kubenswrapper[4693]: I1125 12:30:46.396881 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:30:47 crc kubenswrapper[4693]: I1125 12:30:47.386220 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:30:51 crc kubenswrapper[4693]: I1125 12:30:51.106140 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerName="rabbitmq" containerID="cri-o://a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b" gracePeriod=604796 Nov 25 12:30:51 crc kubenswrapper[4693]: I1125 12:30:51.462618 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4cd38986-be2a-4adf-b594-352740498acd" containerName="rabbitmq" containerID="cri-o://a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac" gracePeriod=604796 Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.755497 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.915822 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-confd\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.915900 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-plugins-conf\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.915956 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dcb107e2-5742-4030-a7fc-a8eb016f449b-pod-info\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.915992 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs9mj\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-kube-api-access-bs9mj\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.916042 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dcb107e2-5742-4030-a7fc-a8eb016f449b-erlang-cookie-secret\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.916077 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.916151 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-erlang-cookie\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.916227 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.916261 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-plugins\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.916296 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-server-conf\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: 
\"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.916346 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-tls\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.929302 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.929318 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.930065 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.964316 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dcb107e2-5742-4030-a7fc-a8eb016f449b-pod-info" (OuterVolumeSpecName: "pod-info") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.975639 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb107e2-5742-4030-a7fc-a8eb016f449b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.975797 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.975949 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-kube-api-access-bs9mj" (OuterVolumeSpecName: "kube-api-access-bs9mj") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "kube-api-access-bs9mj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:57 crc kubenswrapper[4693]: I1125 12:30:57.978553 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.023213 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data" (OuterVolumeSpecName: "config-data") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.023664 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data\") pod \"dcb107e2-5742-4030-a7fc-a8eb016f449b\" (UID: \"dcb107e2-5742-4030-a7fc-a8eb016f449b\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024114 4693 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dcb107e2-5742-4030-a7fc-a8eb016f449b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024154 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024178 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024187 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024196 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024204 4693 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024212 4693 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dcb107e2-5742-4030-a7fc-a8eb016f449b-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024220 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs9mj\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-kube-api-access-bs9mj\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: W1125 12:30:58.024242 4693 empty_dir.go:500] Warning: Unmount skipped because path does not exist: 
/var/lib/kubelet/pods/dcb107e2-5742-4030-a7fc-a8eb016f449b/volumes/kubernetes.io~configmap/config-data Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.024271 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data" (OuterVolumeSpecName: "config-data") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.048297 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.107585 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-server-conf" (OuterVolumeSpecName: "server-conf") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.125944 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.125979 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.125993 4693 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dcb107e2-5742-4030-a7fc-a8eb016f449b-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.133253 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.143555 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dcb107e2-5742-4030-a7fc-a8eb016f449b" (UID: "dcb107e2-5742-4030-a7fc-a8eb016f449b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.163333 4693 generic.go:334] "Generic (PLEG): container finished" podID="4cd38986-be2a-4adf-b594-352740498acd" containerID="a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac" exitCode=0 Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.163496 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cd38986-be2a-4adf-b594-352740498acd","Type":"ContainerDied","Data":"a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac"} Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.163546 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4cd38986-be2a-4adf-b594-352740498acd","Type":"ContainerDied","Data":"2e200845d976e19a3f2eba053630629d292c18d961b86603a944d2d04ec52c74"} Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.163570 4693 scope.go:117] "RemoveContainer" containerID="a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.163593 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.171474 4693 generic.go:334] "Generic (PLEG): container finished" podID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerID="a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b" exitCode=0 Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.171600 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dcb107e2-5742-4030-a7fc-a8eb016f449b","Type":"ContainerDied","Data":"a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b"} Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.171636 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dcb107e2-5742-4030-a7fc-a8eb016f449b","Type":"ContainerDied","Data":"1337c98d3719b6e7375b663fba3337f9dd6a61b07d8a63dd7ee90fcea7aa96ad"} Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.171738 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.192536 4693 scope.go:117] "RemoveContainer" containerID="dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227643 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-tls\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227685 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-erlang-cookie\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227704 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227753 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-server-conf\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227797 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-plugins\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227820 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-config-data\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227878 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd38986-be2a-4adf-b594-352740498acd-erlang-cookie-secret\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227925 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vpzc\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-kube-api-access-2vpzc\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227957 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-plugins-conf\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.227992 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-confd\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.228008 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd38986-be2a-4adf-b594-352740498acd-pod-info\") pod \"4cd38986-be2a-4adf-b594-352740498acd\" (UID: \"4cd38986-be2a-4adf-b594-352740498acd\") " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.228406 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dcb107e2-5742-4030-a7fc-a8eb016f449b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.228755 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.229214 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.231265 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.232405 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4cd38986-be2a-4adf-b594-352740498acd-pod-info" (OuterVolumeSpecName: "pod-info") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.233625 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.234104 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.238842 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). 
InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.239238 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.239592 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-kube-api-access-2vpzc" (OuterVolumeSpecName: "kube-api-access-2vpzc") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "kube-api-access-2vpzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.240261 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd38986-be2a-4adf-b594-352740498acd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.263346 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.269558 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerName="setup-container" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.269597 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerName="setup-container" Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.269639 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd38986-be2a-4adf-b594-352740498acd" containerName="rabbitmq" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.269648 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd38986-be2a-4adf-b594-352740498acd" containerName="rabbitmq" Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.269670 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd38986-be2a-4adf-b594-352740498acd" containerName="setup-container" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.269677 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd38986-be2a-4adf-b594-352740498acd" containerName="setup-container" Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.269696 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerName="rabbitmq" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.269704 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerName="rabbitmq" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.269913 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd38986-be2a-4adf-b594-352740498acd" containerName="rabbitmq" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.269944 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb107e2-5742-4030-a7fc-a8eb016f449b" containerName="rabbitmq" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.271149 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.283430 4693 scope.go:117] "RemoveContainer" containerID="a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.283611 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.283654 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.283703 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.283791 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.283939 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.286844 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.287320 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-drzcq" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.289969 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.297298 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac\": container with ID starting with a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac not found: ID does not exist" containerID="a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.297343 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac"} err="failed to get container status \"a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac\": rpc error: code = NotFound desc = could not find container \"a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac\": container with ID starting with a21746229540f262587792fbcf55409048fdea4cc3032d18f6d63524905a5bac not found: ID does not exist" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.297387 4693 scope.go:117] "RemoveContainer" containerID="dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89" Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.299258 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89\": container with ID starting with dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89 not found: ID does not exist" containerID="dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.299298 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89"} err="failed to get container status 
\"dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89\": rpc error: code = NotFound desc = could not find container \"dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89\": container with ID starting with dd269fd355a10e4218bec3961a1dc96d8230751cf324653c82a3fbf6e7cc6e89 not found: ID does not exist" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.299354 4693 scope.go:117] "RemoveContainer" containerID="a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.325336 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-config-data" (OuterVolumeSpecName: "config-data") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330036 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330058 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330066 4693 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4cd38986-be2a-4adf-b594-352740498acd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330076 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vpzc\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-kube-api-access-2vpzc\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330085 4693 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330093 4693 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4cd38986-be2a-4adf-b594-352740498acd-pod-info\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330100 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330109 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.330129 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.366002 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-server-conf" (OuterVolumeSpecName: "server-conf") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.384461 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.396946 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4cd38986-be2a-4adf-b594-352740498acd" (UID: "4cd38986-be2a-4adf-b594-352740498acd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.439808 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.439853 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.439884 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98b0bc68-9551-407d-8390-66688e8255d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.439921 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.439968 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.439991 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440013 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440070 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440141 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440294 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-kube-api-access-6sghm\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440359 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98b0bc68-9551-407d-8390-66688e8255d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440504 4693 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4cd38986-be2a-4adf-b594-352740498acd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440518 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.440529 4693 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4cd38986-be2a-4adf-b594-352740498acd-server-conf\") on node \"crc\" DevicePath \"\"" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.509054 4693 scope.go:117] "RemoveContainer" containerID="20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.523841 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.536698 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.542102 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-kube-api-access-6sghm\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.542154 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/98b0bc68-9551-407d-8390-66688e8255d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.542903 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.542928 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.542952 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98b0bc68-9551-407d-8390-66688e8255d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.543011 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.543081 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.543100 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.543114 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.543195 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.543315 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.543639 4693 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.560657 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/98b0bc68-9551-407d-8390-66688e8255d3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.565096 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.565816 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-config-data\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.567631 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.579134 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.579202 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/98b0bc68-9551-407d-8390-66688e8255d3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.579524 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.579647 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/98b0bc68-9551-407d-8390-66688e8255d3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.579698 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.581771 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.587877 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/98b0bc68-9551-407d-8390-66688e8255d3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.590025 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/98b0bc68-9551-407d-8390-66688e8255d3-kube-api-access-6sghm\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.602508 4693 scope.go:117] "RemoveContainer" containerID="a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.602859 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.608831 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.609102 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.609206 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f892l" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.609230 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.609385 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.609529 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.612022 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b\": container with ID starting with a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b not found: ID does not exist" containerID="a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.612079 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b"} err="failed to get container status \"a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b\": rpc error: code = NotFound desc = could not find container \"a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b\": container with ID starting with a200054daecf1fd2485e07a013a91274498a777eae6d015876a40d4351c01f3b not found: ID does not exist" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.612106 4693 scope.go:117] "RemoveContainer" containerID="20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435" Nov 25 12:30:58 crc kubenswrapper[4693]: E1125 12:30:58.612408 4693 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435\": container with ID starting with 20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435 not found: ID does not exist" containerID="20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.612432 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435"} err="failed to get container status \"20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435\": rpc error: code = NotFound desc = could not find container \"20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435\": container with ID starting with 20aea4082df2dee881c850d7220bcf6413466cc6d9622f80568f4609d4cda435 not found: ID does not exist" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.618553 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"98b0bc68-9551-407d-8390-66688e8255d3\") " pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.628482 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746495 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746760 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746797 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746815 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746847 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746870 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746899 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqbq6\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-kube-api-access-xqbq6\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746921 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.746948 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.747161 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.747394 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.823661 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd38986-be2a-4adf-b594-352740498acd" path="/var/lib/kubelet/pods/4cd38986-be2a-4adf-b594-352740498acd/volumes" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.824459 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb107e2-5742-4030-a7fc-a8eb016f449b" path="/var/lib/kubelet/pods/dcb107e2-5742-4030-a7fc-a8eb016f449b/volumes" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849556 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849633 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc 
kubenswrapper[4693]: I1125 12:30:58.849693 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849728 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849755 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849785 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849806 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849835 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849860 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849886 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqbq6\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-kube-api-access-xqbq6\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849909 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.849987 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.850896 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.850901 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.850984 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.855052 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.855592 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.855590 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.856053 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.856972 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.858347 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.869349 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqbq6\" (UniqueName: \"kubernetes.io/projected/5c73e56b-c0f3-4d6d-9e33-26fe0d552e24-kube-api-access-xqbq6\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.882988 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24\") " pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.909799 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 25 12:30:58 crc kubenswrapper[4693]: I1125 12:30:58.964771 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 25 12:30:59 crc kubenswrapper[4693]: I1125 12:30:59.247056 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 25 12:30:59 crc kubenswrapper[4693]: W1125 12:30:59.530335 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c73e56b_c0f3_4d6d_9e33_26fe0d552e24.slice/crio-f0c2b4a5b95b5536c49afc7bc6b32d51e62b866fc1189556bd25dc8515b691ec WatchSource:0}: Error finding container f0c2b4a5b95b5536c49afc7bc6b32d51e62b866fc1189556bd25dc8515b691ec: Status 404 returned error can't find the container with id f0c2b4a5b95b5536c49afc7bc6b32d51e62b866fc1189556bd25dc8515b691ec Nov 25 12:30:59 crc kubenswrapper[4693]: I1125 12:30:59.539135 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.248284 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"98b0bc68-9551-407d-8390-66688e8255d3","Type":"ContainerStarted","Data":"dd06927d93d6e53e4eb6afe9945836357695a8db1546977a9c304400c6fb9e64"} Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.250173 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24","Type":"ContainerStarted","Data":"f0c2b4a5b95b5536c49afc7bc6b32d51e62b866fc1189556bd25dc8515b691ec"} Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.487439 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f7944d86c-v9q9x"] Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.489412 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.492489 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.498609 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7944d86c-v9q9x"] Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.600928 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-openstack-edpm-ipam\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.601422 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp5p4\" (UniqueName: \"kubernetes.io/projected/abe9905e-f88c-4e8b-b2fd-798bfca37283-kube-api-access-dp5p4\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.601459 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.601480 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.601583 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-config\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.601634 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.601659 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-svc\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.703095 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.703176 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp5p4\" (UniqueName: \"kubernetes.io/projected/abe9905e-f88c-4e8b-b2fd-798bfca37283-kube-api-access-dp5p4\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.703222 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.703259 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.703329 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-config\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.703431 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.703464 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-svc\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.704292 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-nb\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.704356 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-sb\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.704506 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-swift-storage-0\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: 
\"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.704864 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-svc\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.704886 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-config\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.705050 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-openstack-edpm-ipam\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.723303 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp5p4\" (UniqueName: \"kubernetes.io/projected/abe9905e-f88c-4e8b-b2fd-798bfca37283-kube-api-access-dp5p4\") pod \"dnsmasq-dns-7f7944d86c-v9q9x\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") " pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:00 crc kubenswrapper[4693]: I1125 12:31:00.807923 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:01 crc kubenswrapper[4693]: I1125 12:31:01.261671 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"98b0bc68-9551-407d-8390-66688e8255d3","Type":"ContainerStarted","Data":"8a1fecf27d297f4d25dce1e39735b3e18f615af26f438b33f90e5ade63e538e8"} Nov 25 12:31:01 crc kubenswrapper[4693]: I1125 12:31:01.594131 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f7944d86c-v9q9x"] Nov 25 12:31:01 crc kubenswrapper[4693]: W1125 12:31:01.601398 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabe9905e_f88c_4e8b_b2fd_798bfca37283.slice/crio-83901dce7d9f9ed3fdeb67866a9b7a698e56aadceeed0e6b7d10008b6745bc78 WatchSource:0}: Error finding container 83901dce7d9f9ed3fdeb67866a9b7a698e56aadceeed0e6b7d10008b6745bc78: Status 404 returned error can't find the container with id 83901dce7d9f9ed3fdeb67866a9b7a698e56aadceeed0e6b7d10008b6745bc78 Nov 25 12:31:02 crc kubenswrapper[4693]: I1125 12:31:02.275314 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24","Type":"ContainerStarted","Data":"258d50467e079aa3932a783ecc9db8e11b003cc4080dcedd485643b40ef63df0"} Nov 25 12:31:02 crc kubenswrapper[4693]: I1125 12:31:02.277825 4693 generic.go:334] "Generic (PLEG): container finished" podID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerID="d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72" exitCode=0 Nov 25 12:31:02 crc kubenswrapper[4693]: I1125 12:31:02.278014 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" 
event={"ID":"abe9905e-f88c-4e8b-b2fd-798bfca37283","Type":"ContainerDied","Data":"d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72"} Nov 25 12:31:02 crc kubenswrapper[4693]: I1125 12:31:02.278075 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" event={"ID":"abe9905e-f88c-4e8b-b2fd-798bfca37283","Type":"ContainerStarted","Data":"83901dce7d9f9ed3fdeb67866a9b7a698e56aadceeed0e6b7d10008b6745bc78"} Nov 25 12:31:03 crc kubenswrapper[4693]: I1125 12:31:03.289559 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" event={"ID":"abe9905e-f88c-4e8b-b2fd-798bfca37283","Type":"ContainerStarted","Data":"0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e"} Nov 25 12:31:03 crc kubenswrapper[4693]: I1125 12:31:03.311664 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" podStartSLOduration=3.311648987 podStartE2EDuration="3.311648987s" podCreationTimestamp="2025-11-25 12:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:31:03.307266155 +0000 UTC m=+1383.225351546" watchObservedRunningTime="2025-11-25 12:31:03.311648987 +0000 UTC m=+1383.229734368" Nov 25 12:31:04 crc kubenswrapper[4693]: I1125 12:31:04.297742 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:05 crc kubenswrapper[4693]: I1125 12:31:05.114303 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:31:05 crc kubenswrapper[4693]: I1125 12:31:05.114418 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:31:06 crc kubenswrapper[4693]: I1125 12:31:06.804128 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 12:31:06 crc kubenswrapper[4693]: I1125 12:31:06.805564 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:06 crc kubenswrapper[4693]: I1125 12:31:06.810904 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 25 12:31:06 crc kubenswrapper[4693]: I1125 12:31:06.811497 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 25 12:31:06 crc kubenswrapper[4693]: I1125 12:31:06.859335 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 12:31:06 crc kubenswrapper[4693]: I1125 12:31:06.943272 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06d8269b-d001-4d31-8397-28abccde6858-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:06 crc kubenswrapper[4693]: I1125 12:31:06.943410 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06d8269b-d001-4d31-8397-28abccde6858-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:07 crc kubenswrapper[4693]: I1125 12:31:07.045523 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06d8269b-d001-4d31-8397-28abccde6858-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:07 crc kubenswrapper[4693]: I1125 12:31:07.045918 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06d8269b-d001-4d31-8397-28abccde6858-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:07 crc kubenswrapper[4693]: I1125 12:31:07.046038 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06d8269b-d001-4d31-8397-28abccde6858-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:07 crc kubenswrapper[4693]: I1125 12:31:07.065292 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06d8269b-d001-4d31-8397-28abccde6858-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:07 crc kubenswrapper[4693]: I1125 12:31:07.155619 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:07 crc kubenswrapper[4693]: I1125 12:31:07.415970 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 25 12:31:08 crc kubenswrapper[4693]: I1125 12:31:08.339532 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06d8269b-d001-4d31-8397-28abccde6858","Type":"ContainerStarted","Data":"4dd5807cd695cd3a974d027292d36dbf1a57dc06cd212d185da042350ccc26d2"} Nov 25 12:31:08 crc kubenswrapper[4693]: I1125 12:31:08.339860 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06d8269b-d001-4d31-8397-28abccde6858","Type":"ContainerStarted","Data":"14cb5f605bc25180655e053f25ef87acfe23a9d152353530c6a48db2719587e0"} Nov 25 12:31:08 crc kubenswrapper[4693]: I1125 12:31:08.356649 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.356627524 podStartE2EDuration="2.356627524s" podCreationTimestamp="2025-11-25 12:31:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:31:08.353133566 +0000 UTC m=+1388.271218947" watchObservedRunningTime="2025-11-25 12:31:08.356627524 +0000 UTC m=+1388.274712905" Nov 25 12:31:09 crc kubenswrapper[4693]: I1125 12:31:09.355658 4693 generic.go:334] "Generic (PLEG): container finished" podID="06d8269b-d001-4d31-8397-28abccde6858" containerID="4dd5807cd695cd3a974d027292d36dbf1a57dc06cd212d185da042350ccc26d2" exitCode=0 Nov 25 12:31:09 crc kubenswrapper[4693]: I1125 12:31:09.355735 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06d8269b-d001-4d31-8397-28abccde6858","Type":"ContainerDied","Data":"4dd5807cd695cd3a974d027292d36dbf1a57dc06cd212d185da042350ccc26d2"} Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.682456 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.809642 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.820159 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06d8269b-d001-4d31-8397-28abccde6858-kube-api-access\") pod \"06d8269b-d001-4d31-8397-28abccde6858\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.820296 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06d8269b-d001-4d31-8397-28abccde6858-kubelet-dir\") pod \"06d8269b-d001-4d31-8397-28abccde6858\" (UID: \"06d8269b-d001-4d31-8397-28abccde6858\") " Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.820848 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06d8269b-d001-4d31-8397-28abccde6858-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "06d8269b-d001-4d31-8397-28abccde6858" (UID: "06d8269b-d001-4d31-8397-28abccde6858"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.834280 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d8269b-d001-4d31-8397-28abccde6858-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "06d8269b-d001-4d31-8397-28abccde6858" (UID: "06d8269b-d001-4d31-8397-28abccde6858"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.903004 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55bfb77665-bmc5s"] Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.903271 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerName="dnsmasq-dns" containerID="cri-o://6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f" gracePeriod=10 Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.922965 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06d8269b-d001-4d31-8397-28abccde6858-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.922997 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06d8269b-d001-4d31-8397-28abccde6858-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.983582 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.201:5353: connect: connection refused" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.997467 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d5cf5b645-f87gr"] Nov 25 12:31:10 crc kubenswrapper[4693]: E1125 12:31:10.997885 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d8269b-d001-4d31-8397-28abccde6858" containerName="pruner" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.997903 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d8269b-d001-4d31-8397-28abccde6858" containerName="pruner" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.998175 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d8269b-d001-4d31-8397-28abccde6858" containerName="pruner" Nov 25 12:31:10 crc kubenswrapper[4693]: I1125 12:31:10.999395 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.015235 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d5cf5b645-f87gr"] Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.127777 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.127846 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95dxv\" (UniqueName: \"kubernetes.io/projected/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-kube-api-access-95dxv\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.127957 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.127994 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-dns-svc\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.128029 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.128235 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.128284 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-config\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.230742 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.230811 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95dxv\" (UniqueName: \"kubernetes.io/projected/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-kube-api-access-95dxv\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.230889 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.230912 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-dns-svc\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.230941 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.231034 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.231058 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-config\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.231806 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.232012 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-dns-svc\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.232179 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-config\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.232560 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.233089 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.241287 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.252684 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95dxv\" (UniqueName: \"kubernetes.io/projected/fcc8f52b-776d-4a49-b62f-bf73fcc35fe0-kube-api-access-95dxv\") pod \"dnsmasq-dns-5d5cf5b645-f87gr\" (UID: \"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0\") " pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.332994 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.369804 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.386893 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.386899 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"06d8269b-d001-4d31-8397-28abccde6858","Type":"ContainerDied","Data":"14cb5f605bc25180655e053f25ef87acfe23a9d152353530c6a48db2719587e0"} Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.387054 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14cb5f605bc25180655e053f25ef87acfe23a9d152353530c6a48db2719587e0" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.390685 4693 generic.go:334] "Generic (PLEG): container finished" podID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerID="6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f" exitCode=0 Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.390726 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" event={"ID":"757ab9ac-deb9-4efe-b2fc-5331d4314c0b","Type":"ContainerDied","Data":"6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f"} Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.390759 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" event={"ID":"757ab9ac-deb9-4efe-b2fc-5331d4314c0b","Type":"ContainerDied","Data":"28a66205a79cb1684e0e0284d1009087f6bbbaf86dd4d7a38503f90e0dd7ad1c"} Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.390775 4693 scope.go:117] "RemoveContainer" containerID="6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.390836 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55bfb77665-bmc5s" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.422003 4693 scope.go:117] "RemoveContainer" containerID="69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.434354 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-sb\") pod \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.434525 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-config\") pod \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.434599 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-svc\") pod \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.434684 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-swift-storage-0\") pod \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.434737 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gdvw\" (UniqueName: \"kubernetes.io/projected/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-kube-api-access-9gdvw\") pod \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.434776 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-nb\") pod \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\" (UID: \"757ab9ac-deb9-4efe-b2fc-5331d4314c0b\") " Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.439524 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-kube-api-access-9gdvw" (OuterVolumeSpecName: "kube-api-access-9gdvw") pod "757ab9ac-deb9-4efe-b2fc-5331d4314c0b" (UID: "757ab9ac-deb9-4efe-b2fc-5331d4314c0b"). InnerVolumeSpecName "kube-api-access-9gdvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.448023 4693 scope.go:117] "RemoveContainer" containerID="6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f" Nov 25 12:31:11 crc kubenswrapper[4693]: E1125 12:31:11.448536 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f\": container with ID starting with 6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f not found: ID does not exist" containerID="6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.448574 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f"} err="failed to get container status \"6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f\": rpc error: code = NotFound desc = could not find container \"6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f\": container with ID starting with 6a9efd9c18dcbd6c33b4cec1ac510f4f36f4b2d6781d197cbdf71b8088369f5f not found: ID does not exist" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.448600 4693 scope.go:117] "RemoveContainer" containerID="69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8" Nov 25 12:31:11 crc kubenswrapper[4693]: E1125 12:31:11.448900 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8\": container with ID starting with 69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8 not found: ID does not exist" containerID="69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.448947 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8"} err="failed to get container status \"69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8\": rpc error: code = NotFound desc = could not find container \"69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8\": container with ID starting with 69787debcbd631e4c56d5e7ff1f1681f00d82f912e7e4367ed575c1831c2d6b8 not found: ID does not exist" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.509703 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "757ab9ac-deb9-4efe-b2fc-5331d4314c0b" (UID: "757ab9ac-deb9-4efe-b2fc-5331d4314c0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.510592 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "757ab9ac-deb9-4efe-b2fc-5331d4314c0b" (UID: "757ab9ac-deb9-4efe-b2fc-5331d4314c0b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.510989 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-config" (OuterVolumeSpecName: "config") pod "757ab9ac-deb9-4efe-b2fc-5331d4314c0b" (UID: "757ab9ac-deb9-4efe-b2fc-5331d4314c0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.518066 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "757ab9ac-deb9-4efe-b2fc-5331d4314c0b" (UID: "757ab9ac-deb9-4efe-b2fc-5331d4314c0b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.533018 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "757ab9ac-deb9-4efe-b2fc-5331d4314c0b" (UID: "757ab9ac-deb9-4efe-b2fc-5331d4314c0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.537346 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-config\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.537381 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.537397 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.537410 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gdvw\" (UniqueName: \"kubernetes.io/projected/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-kube-api-access-9gdvw\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.537420 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.537428 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757ab9ac-deb9-4efe-b2fc-5331d4314c0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.726669 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55bfb77665-bmc5s"] Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.738498 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55bfb77665-bmc5s"] Nov 25 12:31:11 crc kubenswrapper[4693]: I1125 12:31:11.873100 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d5cf5b645-f87gr"] Nov 25 12:31:11 crc kubenswrapper[4693]: W1125 12:31:11.882585 4693 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcc8f52b_776d_4a49_b62f_bf73fcc35fe0.slice/crio-00aac371f5b7bc551202a3d3d6ba1284c5a5c9562684e6fbe7ead688b5f3a865 WatchSource:0}: Error finding container 00aac371f5b7bc551202a3d3d6ba1284c5a5c9562684e6fbe7ead688b5f3a865: Status 404 returned error can't find the container with id 00aac371f5b7bc551202a3d3d6ba1284c5a5c9562684e6fbe7ead688b5f3a865
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.007576 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 25 12:31:12 crc kubenswrapper[4693]: E1125 12:31:12.008232 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerName="init"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.008318 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerName="init"
Nov 25 12:31:12 crc kubenswrapper[4693]: E1125 12:31:12.008422 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerName="dnsmasq-dns"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.008608 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerName="dnsmasq-dns"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.008885 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" containerName="dnsmasq-dns"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.009698 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.012235 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.013050 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.016931 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.147839 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa32523e-ff2b-4ce4-90a6-533c59472054-kube-api-access\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.148331 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.148472 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-var-lock\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.250585 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.250944 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-var-lock\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.251072 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa32523e-ff2b-4ce4-90a6-533c59472054-kube-api-access\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.250995 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-var-lock\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.250688 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.267670 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa32523e-ff2b-4ce4-90a6-533c59472054-kube-api-access\") pod \"installer-9-crc\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") " pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.367552 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.402012 4693 generic.go:334] "Generic (PLEG): container finished" podID="fcc8f52b-776d-4a49-b62f-bf73fcc35fe0" containerID="240ea197008c68a12f801148c9723d9d5b48ede649c75e1fa33d8f876db8ac82" exitCode=0
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.402079 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" event={"ID":"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0","Type":"ContainerDied","Data":"240ea197008c68a12f801148c9723d9d5b48ede649c75e1fa33d8f876db8ac82"}
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.402484 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" event={"ID":"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0","Type":"ContainerStarted","Data":"00aac371f5b7bc551202a3d3d6ba1284c5a5c9562684e6fbe7ead688b5f3a865"}
Nov 25 12:31:12 crc kubenswrapper[4693]: W1125 12:31:12.819342 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfa32523e_ff2b_4ce4_90a6_533c59472054.slice/crio-cf728b92e3a19fb366593f6aedf55d0b44f7578c807f1768549fe13c78f5c0ac WatchSource:0}: Error finding container cf728b92e3a19fb366593f6aedf55d0b44f7578c807f1768549fe13c78f5c0ac: Status 404 returned error can't find the container with id cf728b92e3a19fb366593f6aedf55d0b44f7578c807f1768549fe13c78f5c0ac
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.825502 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757ab9ac-deb9-4efe-b2fc-5331d4314c0b" path="/var/lib/kubelet/pods/757ab9ac-deb9-4efe-b2fc-5331d4314c0b/volumes"
Nov 25 12:31:12 crc kubenswrapper[4693]: I1125 12:31:12.826314 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Nov 25 12:31:13 crc kubenswrapper[4693]: I1125 12:31:13.428066 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa32523e-ff2b-4ce4-90a6-533c59472054","Type":"ContainerStarted","Data":"b277c0adc92074b611ced6516a0d62de5c5b1a61acdb97c6ae5464ec5deacf41"}
Nov 25 12:31:13 crc kubenswrapper[4693]: I1125 12:31:13.428483 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa32523e-ff2b-4ce4-90a6-533c59472054","Type":"ContainerStarted","Data":"cf728b92e3a19fb366593f6aedf55d0b44f7578c807f1768549fe13c78f5c0ac"}
Nov 25 12:31:13 crc kubenswrapper[4693]: I1125 12:31:13.471878 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.471859461 podStartE2EDuration="2.471859461s" podCreationTimestamp="2025-11-25 12:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:31:13.454532428 +0000 UTC m=+1393.372617829" watchObservedRunningTime="2025-11-25 12:31:13.471859461 +0000 UTC m=+1393.389944842"
Nov 25 12:31:13 crc kubenswrapper[4693]: I1125 12:31:13.489692 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" event={"ID":"fcc8f52b-776d-4a49-b62f-bf73fcc35fe0","Type":"ContainerStarted","Data":"b484e5a9a722f08a53c9ade099eec61287fdf914d681c87f14cefb0702440fba"}
Nov 25 12:31:13 crc kubenswrapper[4693]: I1125 12:31:13.490785 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr"
Nov 25 12:31:13 crc kubenswrapper[4693]: I1125 12:31:13.529171 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr" podStartSLOduration=3.529150458 podStartE2EDuration="3.529150458s" podCreationTimestamp="2025-11-25 12:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:31:13.524640912 +0000 UTC m=+1393.442726293" watchObservedRunningTime="2025-11-25 12:31:13.529150458 +0000 UTC m=+1393.447235839"
Nov 25 12:31:21 crc kubenswrapper[4693]: I1125 12:31:21.371720 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d5cf5b645-f87gr"
Nov 25 12:31:21 crc kubenswrapper[4693]: I1125 12:31:21.438724 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7944d86c-v9q9x"]
Nov 25 12:31:21 crc kubenswrapper[4693]: I1125 12:31:21.438991 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" podUID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerName="dnsmasq-dns" containerID="cri-o://0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e" gracePeriod=10
Nov 25 12:31:21 crc kubenswrapper[4693]: I1125 12:31:21.946734 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.045699 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp5p4\" (UniqueName: \"kubernetes.io/projected/abe9905e-f88c-4e8b-b2fd-798bfca37283-kube-api-access-dp5p4\") pod \"abe9905e-f88c-4e8b-b2fd-798bfca37283\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") "
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.045914 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-sb\") pod \"abe9905e-f88c-4e8b-b2fd-798bfca37283\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") "
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.045936 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-svc\") pod \"abe9905e-f88c-4e8b-b2fd-798bfca37283\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") "
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.045952 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-config\") pod \"abe9905e-f88c-4e8b-b2fd-798bfca37283\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") "
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.045972 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-swift-storage-0\") pod \"abe9905e-f88c-4e8b-b2fd-798bfca37283\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") "
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.046021 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-nb\") pod \"abe9905e-f88c-4e8b-b2fd-798bfca37283\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") "
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.046085 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-openstack-edpm-ipam\") pod \"abe9905e-f88c-4e8b-b2fd-798bfca37283\" (UID: \"abe9905e-f88c-4e8b-b2fd-798bfca37283\") "
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.062756 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe9905e-f88c-4e8b-b2fd-798bfca37283-kube-api-access-dp5p4" (OuterVolumeSpecName: "kube-api-access-dp5p4") pod "abe9905e-f88c-4e8b-b2fd-798bfca37283" (UID: "abe9905e-f88c-4e8b-b2fd-798bfca37283"). InnerVolumeSpecName "kube-api-access-dp5p4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.105185 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "abe9905e-f88c-4e8b-b2fd-798bfca37283" (UID: "abe9905e-f88c-4e8b-b2fd-798bfca37283"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.113301 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "abe9905e-f88c-4e8b-b2fd-798bfca37283" (UID: "abe9905e-f88c-4e8b-b2fd-798bfca37283"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.125806 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-config" (OuterVolumeSpecName: "config") pod "abe9905e-f88c-4e8b-b2fd-798bfca37283" (UID: "abe9905e-f88c-4e8b-b2fd-798bfca37283"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.125866 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "abe9905e-f88c-4e8b-b2fd-798bfca37283" (UID: "abe9905e-f88c-4e8b-b2fd-798bfca37283"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.132479 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "abe9905e-f88c-4e8b-b2fd-798bfca37283" (UID: "abe9905e-f88c-4e8b-b2fd-798bfca37283"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.133810 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "abe9905e-f88c-4e8b-b2fd-798bfca37283" (UID: "abe9905e-f88c-4e8b-b2fd-798bfca37283"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.149097 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.149146 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp5p4\" (UniqueName: \"kubernetes.io/projected/abe9905e-f88c-4e8b-b2fd-798bfca37283-kube-api-access-dp5p4\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.149166 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.149182 4693 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-config\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.149195 4693 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.149207 4693 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.149218 4693 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/abe9905e-f88c-4e8b-b2fd-798bfca37283-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.574051 4693 generic.go:334] "Generic (PLEG): container finished" podID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerID="0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e" exitCode=0
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.574097 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.574119 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" event={"ID":"abe9905e-f88c-4e8b-b2fd-798bfca37283","Type":"ContainerDied","Data":"0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e"}
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.574161 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f7944d86c-v9q9x" event={"ID":"abe9905e-f88c-4e8b-b2fd-798bfca37283","Type":"ContainerDied","Data":"83901dce7d9f9ed3fdeb67866a9b7a698e56aadceeed0e6b7d10008b6745bc78"}
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.574182 4693 scope.go:117] "RemoveContainer" containerID="0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.611748 4693 scope.go:117] "RemoveContainer" containerID="d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.617433 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f7944d86c-v9q9x"]
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.628568 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f7944d86c-v9q9x"]
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.629927 4693 scope.go:117] "RemoveContainer" containerID="0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e"
Nov 25 12:31:22 crc kubenswrapper[4693]: E1125 12:31:22.630625 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e\": container with ID starting with 0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e not found: ID does not exist" containerID="0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.630683 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e"} err="failed to get container status \"0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e\": rpc error: code = NotFound desc = could not find container \"0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e\": container with ID starting with 0037871c3052ebc4abb591fe026aa04a735891504efecef71fe6ed95a3eab11e not found: ID does not exist"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.630721 4693 scope.go:117] "RemoveContainer" containerID="d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72"
Nov 25 12:31:22 crc kubenswrapper[4693]: E1125 12:31:22.631025 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72\": container with ID starting with d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72 not found: ID does not exist" containerID="d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.631048 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72"} err="failed to get container status \"d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72\": rpc error: code = NotFound desc = could not find container \"d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72\": container with ID starting with d299682dc8a00512d0cc1e7eb1ce24a4ca77c69348854de49ac08ddc961b9c72 not found: ID does not exist"
Nov 25 12:31:22 crc kubenswrapper[4693]: I1125 12:31:22.825697 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe9905e-f88c-4e8b-b2fd-798bfca37283" path="/var/lib/kubelet/pods/abe9905e-f88c-4e8b-b2fd-798bfca37283/volumes"
Nov 25 12:31:33 crc kubenswrapper[4693]: I1125 12:31:33.688708 4693 generic.go:334] "Generic (PLEG): container finished" podID="98b0bc68-9551-407d-8390-66688e8255d3" containerID="8a1fecf27d297f4d25dce1e39735b3e18f615af26f438b33f90e5ade63e538e8" exitCode=0
Nov 25 12:31:33 crc kubenswrapper[4693]: I1125 12:31:33.688799 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"98b0bc68-9551-407d-8390-66688e8255d3","Type":"ContainerDied","Data":"8a1fecf27d297f4d25dce1e39735b3e18f615af26f438b33f90e5ade63e538e8"}
Nov 25 12:31:33 crc kubenswrapper[4693]: I1125 12:31:33.692490 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c73e56b-c0f3-4d6d-9e33-26fe0d552e24" containerID="258d50467e079aa3932a783ecc9db8e11b003cc4080dcedd485643b40ef63df0" exitCode=0
Nov 25 12:31:33 crc kubenswrapper[4693]: I1125 12:31:33.692536 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24","Type":"ContainerDied","Data":"258d50467e079aa3932a783ecc9db8e11b003cc4080dcedd485643b40ef63df0"}
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.682514 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"]
Nov 25 12:31:34 crc kubenswrapper[4693]: E1125 12:31:34.683569 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerName="dnsmasq-dns"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.683587 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerName="dnsmasq-dns"
Nov 25 12:31:34 crc kubenswrapper[4693]: E1125 12:31:34.683602 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerName="init"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.683609 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerName="init"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.683854 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe9905e-f88c-4e8b-b2fd-798bfca37283" containerName="dnsmasq-dns"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.684713 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.693825 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"]
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.693920 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.694106 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.694133 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.699067 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.726619 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5c73e56b-c0f3-4d6d-9e33-26fe0d552e24","Type":"ContainerStarted","Data":"b6ec89ba5bd40c5f736be877bb21f9bda68048cf8d4d09b98ab4d1f1175dd082"}
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.727888 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.732970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"98b0bc68-9551-407d-8390-66688e8255d3","Type":"ContainerStarted","Data":"38f1262df7f7be052d6844f2818a0853cef3c7c8ae7310a672cb980ba4b1ce5b"}
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.733755 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.773401 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.773366746 podStartE2EDuration="36.773366746s" podCreationTimestamp="2025-11-25 12:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:31:34.750049796 +0000 UTC m=+1414.668135187" watchObservedRunningTime="2025-11-25 12:31:34.773366746 +0000 UTC m=+1414.691452127"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.777339 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.777323466 podStartE2EDuration="36.777323466s" podCreationTimestamp="2025-11-25 12:30:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:31:34.770802854 +0000 UTC m=+1414.688888245" watchObservedRunningTime="2025-11-25 12:31:34.777323466 +0000 UTC m=+1414.695408847"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.783018 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.783075 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.783124 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.783249 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmml\" (UniqueName: \"kubernetes.io/projected/3f8577c4-f507-4e40-b284-66d57b0aee3d-kube-api-access-wqmml\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.884961 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmml\" (UniqueName: \"kubernetes.io/projected/3f8577c4-f507-4e40-b284-66d57b0aee3d-kube-api-access-wqmml\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.885139 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.885191 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.885305 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.893118 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.893848 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.897906 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:34 crc kubenswrapper[4693]: I1125 12:31:34.903923 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmml\" (UniqueName: \"kubernetes.io/projected/3f8577c4-f507-4e40-b284-66d57b0aee3d-kube-api-access-wqmml\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.023042 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.113858 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.113954 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.114041 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d"
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.115688 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce4776c622bc7e46d7d568ae624b5c3426e9dfd4bd443fa89113683ec10d405f"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.115787 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://ce4776c622bc7e46d7d568ae624b5c3426e9dfd4bd443fa89113683ec10d405f" gracePeriod=600
Nov 25 12:31:35 crc kubenswrapper[4693]: W1125 12:31:35.648692 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f8577c4_f507_4e40_b284_66d57b0aee3d.slice/crio-0c6292d656a5ccb562926ca6ff194b461856370674a12ed89f6a3f61f9703ecd WatchSource:0}: Error finding container 0c6292d656a5ccb562926ca6ff194b461856370674a12ed89f6a3f61f9703ecd: Status 404 returned error can't find the container with id 0c6292d656a5ccb562926ca6ff194b461856370674a12ed89f6a3f61f9703ecd
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.652179 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k"]
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.742785 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" event={"ID":"3f8577c4-f507-4e40-b284-66d57b0aee3d","Type":"ContainerStarted","Data":"0c6292d656a5ccb562926ca6ff194b461856370674a12ed89f6a3f61f9703ecd"}
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.745849 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="ce4776c622bc7e46d7d568ae624b5c3426e9dfd4bd443fa89113683ec10d405f" exitCode=0
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.746030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"ce4776c622bc7e46d7d568ae624b5c3426e9dfd4bd443fa89113683ec10d405f"}
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.746071 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349"}
Nov 25 12:31:35 crc kubenswrapper[4693]: I1125 12:31:35.746090 4693 scope.go:117] "RemoveContainer" containerID="245f737f203c8007cd386e41d5f986e5bdb4a5f145f31a6ec9ef66e36fb73a9f"
Nov 25 12:31:47 crc kubenswrapper[4693]: I1125 12:31:47.893312 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" event={"ID":"3f8577c4-f507-4e40-b284-66d57b0aee3d","Type":"ContainerStarted","Data":"00250ccaada6786ded04045169aa91c30106c25350a2d7b9c8ab4f662be2c6c7"}
Nov 25 12:31:47 crc kubenswrapper[4693]: I1125 12:31:47.913688 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" podStartSLOduration=2.865613219 podStartE2EDuration="13.913663525s" podCreationTimestamp="2025-11-25 12:31:34 +0000 UTC" firstStartedPulling="2025-11-25 12:31:35.650936308 +0000 UTC m=+1415.569021689" lastFinishedPulling="2025-11-25 12:31:46.698986604 +0000 UTC m=+1426.617071995" observedRunningTime="2025-11-25 12:31:47.906996352 +0000 UTC m=+1427.825081733" watchObservedRunningTime="2025-11-25 12:31:47.913663525 +0000 UTC m=+1427.831748906"
Nov 25 12:31:48 crc kubenswrapper[4693]: I1125 12:31:48.913597 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Nov 25 12:31:48 crc kubenswrapper[4693]: I1125 12:31:48.968583 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.935115 4693 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.935684 4693 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.936632 4693 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.936880 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.937072 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c" gracePeriod=15
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.937281 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535" gracePeriod=15
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.937328 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7" gracePeriod=15
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.937418 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f" gracePeriod=15
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.937482 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f" gracePeriod=15
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.937532 4693 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.938186 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.938266 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.938338 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.939849 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.939939 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.940015 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.940094 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.940168 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.940240 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.940314 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.940446 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.940548 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Nov 25 12:31:50 crc kubenswrapper[4693]: E1125 12:31:50.940637 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.940712 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.941085 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.941171 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.941246 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.941349 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.941459 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Nov 25 12:31:50 crc kubenswrapper[4693]: I1125 12:31:50.941949 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.026304 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.030696 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.030763 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.030861 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.030941 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.031130 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.031221 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.031242 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.135906 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136036 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136087 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136108 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136180 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136294 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136331 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136385 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136485 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136536 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136568 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136597 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136625 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136651 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136681 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.136713 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.915063 4693 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body=
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.915135 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.932341 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.933803 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.934532 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535" exitCode=0
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.934634 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f" exitCode=0
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.934700 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7" exitCode=0
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.934757 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f" exitCode=2
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.934901 4693 scope.go:117] "RemoveContainer" containerID="44dc8d0132b6202488e1baadcb380103e55664c686e8bb137e5a4a1261537162"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.937055 4693 generic.go:334] "Generic (PLEG): container finished" podID="fa32523e-ff2b-4ce4-90a6-533c59472054" containerID="b277c0adc92074b611ced6516a0d62de5c5b1a61acdb97c6ae5464ec5deacf41" exitCode=0
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.937098 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa32523e-ff2b-4ce4-90a6-533c59472054","Type":"ContainerDied","Data":"b277c0adc92074b611ced6516a0d62de5c5b1a61acdb97c6ae5464ec5deacf41"}
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.938127 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Nov 25 12:31:51 crc kubenswrapper[4693]: I1125 12:31:51.938410 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Nov 25 12:31:52 crc kubenswrapper[4693]: I1125 12:31:52.446292 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="ee5b4281-3cdb-4bad-8002-8520136232a4" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Nov 25 12:31:52 crc kubenswrapper[4693]: E1125 12:31:52.447043 4693 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-state-metrics-0.187b3fe07ce9c831 openstack 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openstack,Name:kube-state-metrics-0,UID:ee5b4281-3cdb-4bad-8002-8520136232a4,APIVersion:v1,ResourceVersion:44660,FieldPath:spec.containers{kube-state-metrics},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 503,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 12:31:52.446601265 +0000 UTC m=+1432.364686636,LastTimestamp:2025-11-25 12:31:52.446601265 +0000 UTC m=+1432.364686636,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 25 12:31:52 crc kubenswrapper[4693]: I1125 12:31:52.950158 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.410694 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.411820 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.491206 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-kubelet-dir\") pod \"fa32523e-ff2b-4ce4-90a6-533c59472054\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") "
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.491360 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fa32523e-ff2b-4ce4-90a6-533c59472054" (UID: "fa32523e-ff2b-4ce4-90a6-533c59472054"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.491454 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-var-lock\") pod \"fa32523e-ff2b-4ce4-90a6-533c59472054\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") "
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.491534 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-var-lock" (OuterVolumeSpecName: "var-lock") pod "fa32523e-ff2b-4ce4-90a6-533c59472054" (UID: "fa32523e-ff2b-4ce4-90a6-533c59472054"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.491569 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa32523e-ff2b-4ce4-90a6-533c59472054-kube-api-access\") pod \"fa32523e-ff2b-4ce4-90a6-533c59472054\" (UID: \"fa32523e-ff2b-4ce4-90a6-533c59472054\") "
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.492051 4693 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-var-lock\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.492086 4693 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa32523e-ff2b-4ce4-90a6-533c59472054-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.499130 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa32523e-ff2b-4ce4-90a6-533c59472054-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fa32523e-ff2b-4ce4-90a6-533c59472054" (UID: "fa32523e-ff2b-4ce4-90a6-533c59472054"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.594255 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa32523e-ff2b-4ce4-90a6-533c59472054-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.665726 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.666459 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.667129 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.667440 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused"
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.797540 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.797635 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.797726 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.797748 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.797872 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.797974 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.798290 4693 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.798310 4693 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.798319 4693 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.967890 4693 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.967891 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fa32523e-ff2b-4ce4-90a6-533c59472054","Type":"ContainerDied","Data":"cf728b92e3a19fb366593f6aedf55d0b44f7578c807f1768549fe13c78f5c0ac"} Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.968050 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf728b92e3a19fb366593f6aedf55d0b44f7578c807f1768549fe13c78f5c0ac" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.972088 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.973576 4693 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c" exitCode=0 Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.973637 4693 scope.go:117] "RemoveContainer" containerID="6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.973740 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.990210 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.990466 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.990932 4693 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:53 crc kubenswrapper[4693]: I1125 12:31:53.991170 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.009153 4693 scope.go:117] "RemoveContainer" containerID="6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.031881 4693 scope.go:117] "RemoveContainer" containerID="33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.050500 4693 scope.go:117] "RemoveContainer" containerID="6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f" Nov 25 
12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.069855 4693 scope.go:117] "RemoveContainer" containerID="853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.094605 4693 scope.go:117] "RemoveContainer" containerID="1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.137172 4693 scope.go:117] "RemoveContainer" containerID="6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535" Nov 25 12:31:54 crc kubenswrapper[4693]: E1125 12:31:54.137887 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\": container with ID starting with 6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535 not found: ID does not exist" containerID="6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.137944 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535"} err="failed to get container status \"6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\": rpc error: code = NotFound desc = could not find container \"6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535\": container with ID starting with 6095f94cf70ea2079c7f10be93326e750271f2b5373ea4dabc0aff801aaad535 not found: ID does not exist" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.137982 4693 scope.go:117] "RemoveContainer" containerID="6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f" Nov 25 12:31:54 crc kubenswrapper[4693]: E1125 12:31:54.141968 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\": container with ID starting with 6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f not found: ID does not exist" containerID="6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.142008 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f"} err="failed to get container status \"6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\": rpc error: code = NotFound desc = could not find container \"6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f\": container with ID starting with 6537d07697f846d74b9c2325e7a87251cbd625d698389c965bd19c0c625c7f5f not found: ID does not exist" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.142039 4693 scope.go:117] "RemoveContainer" containerID="33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7" Nov 25 12:31:54 crc kubenswrapper[4693]: E1125 12:31:54.142634 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\": container with ID starting with 33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7 not found: ID does not exist" containerID="33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.142658 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7"} err="failed to get container status \"33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\": rpc error: code = NotFound desc = could not find container \"33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7\": container with ID starting with 33716fd5839c922735b3d71fb7737798bfb04aad48f47643da4e25ca3df75dd7 not found: ID does not exist" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.142672 4693 scope.go:117] "RemoveContainer" containerID="6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f" Nov 25 12:31:54 crc kubenswrapper[4693]: E1125 12:31:54.143517 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\": container with ID starting with 6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f not found: ID does not exist" containerID="6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.143541 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f"} err="failed to get container status \"6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\": rpc error: code = NotFound desc = could not find container \"6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f\": container with ID starting with 6babda5834fa7bdb6a55ab29a54eba2806c55e64ee1fed05582e1f43fad76c8f not found: ID does not exist" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.143562 4693 scope.go:117] "RemoveContainer" containerID="853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c" Nov 25 12:31:54 crc kubenswrapper[4693]: E1125 12:31:54.144589 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\": container with ID starting with 853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c not found: ID does not exist" containerID="853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.144622 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c"} err="failed to get container status \"853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\": rpc error: code = NotFound desc = could not find container \"853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c\": container with ID starting with 853fe3be310bd42714bbecc5f12f3dab06cbcfe62ae451337bc13c5027e0d32c not found: ID does not exist" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.144644 4693 scope.go:117] "RemoveContainer" containerID="1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce" Nov 25 12:31:54 crc kubenswrapper[4693]: E1125 12:31:54.145028 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\": container with ID starting with 1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce not found: ID does 
not exist" containerID="1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.145055 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce"} err="failed to get container status \"1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\": rpc error: code = NotFound desc = could not find container \"1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce\": container with ID starting with 1d9c8d2204f11634af09ed237f6ff34c9a87001a81a520e7350883278d4ca3ce not found: ID does not exist" Nov 25 12:31:54 crc kubenswrapper[4693]: I1125 12:31:54.825633 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 25 12:31:55 crc kubenswrapper[4693]: E1125 12:31:55.995883 4693 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 12:31:55 crc kubenswrapper[4693]: I1125 12:31:55.997062 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 12:31:56 crc kubenswrapper[4693]: W1125 12:31:56.042272 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b0e48b00a66cdba76bc53e96dd0ba41d093081323b9ed9003406bf73869fcd2b WatchSource:0}: Error finding container b0e48b00a66cdba76bc53e96dd0ba41d093081323b9ed9003406bf73869fcd2b: Status 404 returned error can't find the container with id b0e48b00a66cdba76bc53e96dd0ba41d093081323b9ed9003406bf73869fcd2b Nov 25 12:31:57 crc kubenswrapper[4693]: I1125 12:31:57.003962 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448"} Nov 25 12:31:57 crc kubenswrapper[4693]: I1125 12:31:57.004248 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b0e48b00a66cdba76bc53e96dd0ba41d093081323b9ed9003406bf73869fcd2b"} Nov 25 12:31:57 crc kubenswrapper[4693]: E1125 12:31:57.004815 4693 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 12:31:57 crc kubenswrapper[4693]: I1125 12:31:57.005220 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:57 crc kubenswrapper[4693]: E1125 12:31:57.797805 4693 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:57 crc kubenswrapper[4693]: E1125 12:31:57.798573 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:57 crc kubenswrapper[4693]: E1125 12:31:57.799084 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:57 crc kubenswrapper[4693]: E1125 12:31:57.799459 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:57 crc kubenswrapper[4693]: E1125 12:31:57.799822 4693 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:57 crc kubenswrapper[4693]: I1125 12:31:57.799852 4693 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 25 12:31:57 crc kubenswrapper[4693]: E1125 12:31:57.800090 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms" Nov 25 12:31:58 crc kubenswrapper[4693]: E1125 12:31:58.001279 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="400ms" Nov 25 12:31:58 crc kubenswrapper[4693]: I1125 12:31:58.015695 4693 generic.go:334] "Generic (PLEG): container finished" podID="3f8577c4-f507-4e40-b284-66d57b0aee3d" containerID="00250ccaada6786ded04045169aa91c30106c25350a2d7b9c8ab4f662be2c6c7" exitCode=0 Nov 25 12:31:58 crc kubenswrapper[4693]: I1125 12:31:58.015745 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" event={"ID":"3f8577c4-f507-4e40-b284-66d57b0aee3d","Type":"ContainerDied","Data":"00250ccaada6786ded04045169aa91c30106c25350a2d7b9c8ab4f662be2c6c7"} Nov 25 12:31:58 crc kubenswrapper[4693]: I1125 12:31:58.016611 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:58 crc kubenswrapper[4693]: I1125 12:31:58.017146 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:58 crc kubenswrapper[4693]: E1125 12:31:58.402643 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms" Nov 25 12:31:58 crc kubenswrapper[4693]: E1125 12:31:58.771953 4693 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-state-metrics-0.187b3fe07ce9c831 openstack 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openstack,Name:kube-state-metrics-0,UID:ee5b4281-3cdb-4bad-8002-8520136232a4,APIVersion:v1,ResourceVersion:44660,FieldPath:spec.containers{kube-state-metrics},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 503,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-25 12:31:52.446601265 +0000 UTC m=+1432.364686636,LastTimestamp:2025-11-25 12:31:52.446601265 +0000 UTC m=+1432.364686636,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 25 12:31:59 crc kubenswrapper[4693]: E1125 12:31:59.154781 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:31:59Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:31:59Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:31:59Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-25T12:31:59Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:59 crc kubenswrapper[4693]: E1125 12:31:59.155332 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:59 crc kubenswrapper[4693]: E1125 12:31:59.155669 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:59 crc kubenswrapper[4693]: E1125 12:31:59.155896 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:59 crc kubenswrapper[4693]: E1125 12:31:59.156120 4693 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:31:59 crc kubenswrapper[4693]: E1125 12:31:59.156152 4693 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 25 12:31:59 crc kubenswrapper[4693]: E1125 12:31:59.203262 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s" Nov 25 12:32:00 crc kubenswrapper[4693]: E1125 12:32:00.803906 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="3.2s" Nov 25 12:32:00 crc kubenswrapper[4693]: I1125 12:32:00.827076 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:00 crc kubenswrapper[4693]: I1125 12:32:00.827610 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:02 crc kubenswrapper[4693]: I1125 12:32:02.054350 4693 generic.go:334] "Generic (PLEG): container finished" podID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" containerID="18c41683de3a4d0c38a6284ceb31c5e0f2f2a57df60a13a64d6b95c56a8faa33" exitCode=1 Nov 25 12:32:02 crc kubenswrapper[4693]: I1125 12:32:02.054480 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" event={"ID":"0d2b9e6f-fe11-47e3-af7b-cca0fff65798","Type":"ContainerDied","Data":"18c41683de3a4d0c38a6284ceb31c5e0f2f2a57df60a13a64d6b95c56a8faa33"} Nov 25 12:32:02 crc kubenswrapper[4693]: I1125 12:32:02.055531 4693 scope.go:117] "RemoveContainer" containerID="18c41683de3a4d0c38a6284ceb31c5e0f2f2a57df60a13a64d6b95c56a8faa33" Nov 25 12:32:02 crc kubenswrapper[4693]: I1125 12:32:02.058002 4693 status_manager.go:851] "Failed to get status for pod" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5995bbfc5f-c8gkc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:02 crc kubenswrapper[4693]: I1125 12:32:02.058452 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:02 crc kubenswrapper[4693]: I1125 12:32:02.058938 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:02 crc kubenswrapper[4693]: I1125 12:32:02.448215 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="ee5b4281-3cdb-4bad-8002-8520136232a4" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.066991 4693 generic.go:334] "Generic (PLEG): container finished" podID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" containerID="9276d800f34d5243ede48bbfd8f32cad496ae6ac720cacf0b6ce0acc117dae10" exitCode=1 Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.067089 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" event={"ID":"0d2b9e6f-fe11-47e3-af7b-cca0fff65798","Type":"ContainerDied","Data":"9276d800f34d5243ede48bbfd8f32cad496ae6ac720cacf0b6ce0acc117dae10"} Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.067264 4693 scope.go:117] "RemoveContainer" containerID="18c41683de3a4d0c38a6284ceb31c5e0f2f2a57df60a13a64d6b95c56a8faa33" Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.068020 4693 scope.go:117] "RemoveContainer" containerID="9276d800f34d5243ede48bbfd8f32cad496ae6ac720cacf0b6ce0acc117dae10" Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.068159 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:03 crc kubenswrapper[4693]: E1125 12:32:03.068283 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-5995bbfc5f-c8gkc_metallb-system(0d2b9e6f-fe11-47e3-af7b-cca0fff65798)\"" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.068712 4693 status_manager.go:851] "Failed to get status for pod" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5995bbfc5f-c8gkc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.069342 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: 
connect: connection refused" Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.934708 4693 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Nov 25 12:32:03 crc kubenswrapper[4693]: I1125 12:32:03.935080 4693 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Nov 25 12:32:04 crc kubenswrapper[4693]: E1125 12:32:04.005664 4693 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="6.4s" Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.084074 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.084139 4693 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9" exitCode=1 Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.084175 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9"} Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.084930 4693 scope.go:117] "RemoveContainer" containerID="de9932a8aa0bef9db99a8d7370e112747701ee466205c717ed9b7f6d48766ee9" Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.085167 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.086436 4693 status_manager.go:851] "Failed to get status for pod" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5995bbfc5f-c8gkc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.086772 4693 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:04 crc kubenswrapper[4693]: I1125 12:32:04.087240 4693 status_manager.go:851] "Failed to get status 
for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.096720 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.097685 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7ed050d32db32780c13f82b1f53245a4afdd954160aa5b065b1985ffec5b8d58"} Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.098916 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.099616 4693 status_manager.go:851] "Failed to get status for pod" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5995bbfc5f-c8gkc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.100098 4693 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.100560 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.812583 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.813763 4693 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.814337 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.814898 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.815516 4693 status_manager.go:851] "Failed to get status for pod" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5995bbfc5f-c8gkc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.830946 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.832130 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:05 crc kubenswrapper[4693]: E1125 12:32:05.832744 4693 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:05 crc kubenswrapper[4693]: I1125 12:32:05.833800 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:05 crc kubenswrapper[4693]: W1125 12:32:05.874558 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-a29026901b218cc6f58cae10a82c9a6af1aa72a880c570556dc4cc5268f4845c WatchSource:0}: Error finding container a29026901b218cc6f58cae10a82c9a6af1aa72a880c570556dc4cc5268f4845c: Status 404 returned error can't find the container with id a29026901b218cc6f58cae10a82c9a6af1aa72a880c570556dc4cc5268f4845c Nov 25 12:32:06 crc kubenswrapper[4693]: I1125 12:32:06.109247 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a29026901b218cc6f58cae10a82c9a6af1aa72a880c570556dc4cc5268f4845c"} Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.121054 4693 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="472cbf3b6bc92faddfe05e9511d638dcae3b8ec0468fee2c832436894bdb800e" exitCode=0 Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.121093 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"472cbf3b6bc92faddfe05e9511d638dcae3b8ec0468fee2c832436894bdb800e"} Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.121390 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.121412 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:07 crc kubenswrapper[4693]: E1125 12:32:07.121888 4693 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.121992 4693 status_manager.go:851] "Failed to get status for pod" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openstack/pods/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.122559 4693 status_manager.go:851] "Failed to get status for pod" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/metallb-system/pods/metallb-operator-controller-manager-5995bbfc5f-c8gkc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.123809 4693 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:07 
crc kubenswrapper[4693]: I1125 12:32:07.124084 4693 status_manager.go:851] "Failed to get status for pod" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.704507 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:32:07 crc kubenswrapper[4693]: I1125 12:32:07.705636 4693 scope.go:117] "RemoveContainer" containerID="9276d800f34d5243ede48bbfd8f32cad496ae6ac720cacf0b6ce0acc117dae10" Nov 25 12:32:07 crc kubenswrapper[4693]: E1125 12:32:07.709415 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=metallb-operator-controller-manager-5995bbfc5f-c8gkc_metallb-system(0d2b9e6f-fe11-47e3-af7b-cca0fff65798)\"" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" Nov 25 12:32:08 crc kubenswrapper[4693]: I1125 12:32:08.133338 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"99e6b04c0291d66233876953a12548fd35718a6fcc6e5d07719160016a02e09d"} Nov 25 12:32:08 crc kubenswrapper[4693]: I1125 12:32:08.133652 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f11b9c811cce1e1253f2012ffb6e06e2a866198026c890cd426a8bb96c6f6700"} Nov 25 12:32:08 crc kubenswrapper[4693]: I1125 12:32:08.133663 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"663f2b24dc64873c527dbe128e7d82fc9fa0d7e1c945c39b2e9d9e2451b2e8f3"} Nov 25 12:32:09 crc kubenswrapper[4693]: I1125 12:32:09.142913 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e7a0e9f343cd4da5b5eaf221b8dea700fdd94f261eb69330a790ae9bb5dbce12"} Nov 25 12:32:09 crc kubenswrapper[4693]: I1125 12:32:09.143185 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:09 crc kubenswrapper[4693]: I1125 12:32:09.143203 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:09 crc kubenswrapper[4693]: I1125 12:32:09.143201 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aed11280f45acaf15f263d5cdf3f8b8a2ed8166af80afcaaaaf15ae5225fc31a"} Nov 25 12:32:09 crc kubenswrapper[4693]: I1125 12:32:09.143290 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:10 crc kubenswrapper[4693]: I1125 12:32:10.834669 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:10 crc kubenswrapper[4693]: I1125 12:32:10.835195 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:10 crc kubenswrapper[4693]: I1125 12:32:10.841050 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:11 crc kubenswrapper[4693]: I1125 12:32:11.163286 4693 generic.go:334] "Generic (PLEG): container finished" podID="c3a7c8cb-ac3c-43d3-b38d-0c3625c53196" containerID="5cc961f810e4c360ff7afa41b8fd4044ac91c3a4062a28a1ef8eb78c708dd041" exitCode=1 Nov 25 12:32:11 crc kubenswrapper[4693]: I1125 12:32:11.163352 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" event={"ID":"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196","Type":"ContainerDied","Data":"5cc961f810e4c360ff7afa41b8fd4044ac91c3a4062a28a1ef8eb78c708dd041"} Nov 25 12:32:11 crc kubenswrapper[4693]: I1125 12:32:11.164354 4693 scope.go:117] "RemoveContainer" containerID="5cc961f810e4c360ff7afa41b8fd4044ac91c3a4062a28a1ef8eb78c708dd041" Nov 25 12:32:12 crc kubenswrapper[4693]: I1125 12:32:12.173878 4693 generic.go:334] "Generic (PLEG): container finished" podID="c3a7c8cb-ac3c-43d3-b38d-0c3625c53196" containerID="3398f0e87aacd1d0644fb32e02b5520ecc324408aad1d8f9e4117a34cd6e9623" exitCode=1 Nov 25 12:32:12 crc kubenswrapper[4693]: I1125 12:32:12.173976 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" event={"ID":"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196","Type":"ContainerDied","Data":"3398f0e87aacd1d0644fb32e02b5520ecc324408aad1d8f9e4117a34cd6e9623"} Nov 25 12:32:12 crc kubenswrapper[4693]: I1125 12:32:12.174415 4693 scope.go:117] "RemoveContainer" containerID="5cc961f810e4c360ff7afa41b8fd4044ac91c3a4062a28a1ef8eb78c708dd041" Nov 25 12:32:12 crc kubenswrapper[4693]: I1125 12:32:12.175207 4693 scope.go:117] "RemoveContainer" containerID="3398f0e87aacd1d0644fb32e02b5520ecc324408aad1d8f9e4117a34cd6e9623" Nov 25 12:32:12 crc kubenswrapper[4693]: E1125 12:32:12.175495 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-6fdc4fcf86-bnf27_openstack-operators(c3a7c8cb-ac3c-43d3-b38d-0c3625c53196)\"" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" podUID="c3a7c8cb-ac3c-43d3-b38d-0c3625c53196" Nov 25 12:32:12 crc kubenswrapper[4693]: I1125 12:32:12.446537 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="ee5b4281-3cdb-4bad-8002-8520136232a4" containerName="kube-state-metrics" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 25 12:32:12 crc kubenswrapper[4693]: I1125 12:32:12.446617 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/kube-state-metrics-0" Nov 25 12:32:12 crc kubenswrapper[4693]: I1125 12:32:12.447326 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-state-metrics" containerStatusID={"Type":"cri-o","ID":"d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425"} pod="openstack/kube-state-metrics-0" containerMessage="Container kube-state-metrics failed liveness probe, will be restarted" Nov 25 12:32:12 crc 
kubenswrapper[4693]: I1125 12:32:12.447407 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ee5b4281-3cdb-4bad-8002-8520136232a4" containerName="kube-state-metrics" containerID="cri-o://d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425" gracePeriod=30 Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.190397 4693 generic.go:334] "Generic (PLEG): container finished" podID="ee5b4281-3cdb-4bad-8002-8520136232a4" containerID="d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425" exitCode=2 Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.190676 4693 generic.go:334] "Generic (PLEG): container finished" podID="ee5b4281-3cdb-4bad-8002-8520136232a4" containerID="6cdb78f307ddbd2a927cdf209003aad4c13e00bec555f10892b052e37711da6c" exitCode=1 Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.190692 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee5b4281-3cdb-4bad-8002-8520136232a4","Type":"ContainerDied","Data":"d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425"} Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.190716 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee5b4281-3cdb-4bad-8002-8520136232a4","Type":"ContainerDied","Data":"6cdb78f307ddbd2a927cdf209003aad4c13e00bec555f10892b052e37711da6c"} Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.190736 4693 scope.go:117] "RemoveContainer" containerID="d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425" Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.191326 4693 scope.go:117] "RemoveContainer" containerID="6cdb78f307ddbd2a927cdf209003aad4c13e00bec555f10892b052e37711da6c" Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.231074 4693 scope.go:117] "RemoveContainer" containerID="d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425" Nov 25 12:32:13 crc kubenswrapper[4693]: E1125 12:32:13.231671 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425\": container with ID starting with d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425 not found: ID does not exist" containerID="d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425" Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.231727 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425"} err="failed to get container status \"d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425\": rpc error: code = NotFound desc = could not find container \"d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425\": container with ID starting with d27b7b5a212ddc5e6603eef3106355cc8664a2ea073cc1c2aeb32f0f2e906425 not found: ID does not exist" Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.394433 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.398589 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:32:13 crc kubenswrapper[4693]: I1125 12:32:13.933951 4693 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.155361 4693 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.158832 4693 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5c154f2a-1272-4355-9c90-4ba1ac6b7118\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:32:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:32:07Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-25T12:32:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://663f2b24dc64873c527dbe128e7d82fc9fa0d7e1c945c39b2e9d9e2451b2e8f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99e6b04c0291d66233876953a12548fd35718a6fcc6e5d07719160016a02e09d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11b9c811cce1e1253f2012ffb6e06e2a866198026c890cd426a8bb96c6f6700\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:32:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7a0e9f343cd4da5b5eaf221b8dea700fdd94f261eb69330a790ae9bb5dbce12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:32:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aed11280f45acaf15f263d5cdf3f8b8a2ed8166af80afcaaaaf15ae5225fc31a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-25T12:32:08Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://472cbf3b6bc92faddfe05e9511d638dcae3b8ec0468fee2c832436894bdb800e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://472cbf3b6bc92faddfe05e9511d638dcae3b8ec0468fee2c832436894bdb800e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-25T12:32:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-25T12:32:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}]}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.201703 4693 generic.go:334] "Generic (PLEG): container finished" podID="ef0b302b-05d0-4be3-85ad-7eb3d70cec36" containerID="f4bff34b34e94d1610d7a484673bcf1170a3be8c9a7208ea7be1f78b05452b45" exitCode=1 Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.201786 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" 
event={"ID":"ef0b302b-05d0-4be3-85ad-7eb3d70cec36","Type":"ContainerDied","Data":"f4bff34b34e94d1610d7a484673bcf1170a3be8c9a7208ea7be1f78b05452b45"} Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.202442 4693 scope.go:117] "RemoveContainer" containerID="f4bff34b34e94d1610d7a484673bcf1170a3be8c9a7208ea7be1f78b05452b45" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.205345 4693 generic.go:334] "Generic (PLEG): container finished" podID="105791fd-407d-44a3-8fc8-af90e82b0f63" containerID="12d86e8aa72882509063f1cd74a6d24c85786c9a11199879e8818192e9d52fff" exitCode=1 Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.205478 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" event={"ID":"105791fd-407d-44a3-8fc8-af90e82b0f63","Type":"ContainerDied","Data":"12d86e8aa72882509063f1cd74a6d24c85786c9a11199879e8818192e9d52fff"} Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.206219 4693 scope.go:117] "RemoveContainer" containerID="12d86e8aa72882509063f1cd74a6d24c85786c9a11199879e8818192e9d52fff" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.209119 4693 generic.go:334] "Generic (PLEG): container finished" podID="a7c4eb9b-38af-41da-872e-b3da515b2f88" containerID="81f51a4757df61cde05d7103c5afa05ab33ddd62b2f9070c92b333b26afb71d9" exitCode=1 Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.209188 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" event={"ID":"a7c4eb9b-38af-41da-872e-b3da515b2f88","Type":"ContainerDied","Data":"81f51a4757df61cde05d7103c5afa05ab33ddd62b2f9070c92b333b26afb71d9"} Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.209693 4693 scope.go:117] "RemoveContainer" containerID="81f51a4757df61cde05d7103c5afa05ab33ddd62b2f9070c92b333b26afb71d9" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.216795 4693 generic.go:334] "Generic (PLEG): container finished" podID="ebf85cb6-2651-4b5f-9cbe-973db55e14c5" containerID="3c51e8afbc12b65d0a94a15851f512a69a6009e5bec43aa4991a150caf98c439" exitCode=1 Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.216882 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" event={"ID":"ebf85cb6-2651-4b5f-9cbe-973db55e14c5","Type":"ContainerDied","Data":"3c51e8afbc12b65d0a94a15851f512a69a6009e5bec43aa4991a150caf98c439"} Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.217721 4693 scope.go:117] "RemoveContainer" containerID="3c51e8afbc12b65d0a94a15851f512a69a6009e5bec43aa4991a150caf98c439" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.238293 4693 generic.go:334] "Generic (PLEG): container finished" podID="ee5b4281-3cdb-4bad-8002-8520136232a4" containerID="cad48c1e5032e0c7214de93640ad44a9ae54028f19294125b41d143c22c68223" exitCode=1 Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.238429 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee5b4281-3cdb-4bad-8002-8520136232a4","Type":"ContainerDied","Data":"cad48c1e5032e0c7214de93640ad44a9ae54028f19294125b41d143c22c68223"} Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.238494 4693 scope.go:117] "RemoveContainer" containerID="6cdb78f307ddbd2a927cdf209003aad4c13e00bec555f10892b052e37711da6c" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.239402 4693 scope.go:117] "RemoveContainer" 
containerID="cad48c1e5032e0c7214de93640ad44a9ae54028f19294125b41d143c22c68223" Nov 25 12:32:14 crc kubenswrapper[4693]: E1125 12:32:14.239747 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-state-metrics pod=kube-state-metrics-0_openstack(ee5b4281-3cdb-4bad-8002-8520136232a4)\"" pod="openstack/kube-state-metrics-0" podUID="ee5b4281-3cdb-4bad-8002-8520136232a4" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.255485 4693 generic.go:334] "Generic (PLEG): container finished" podID="f6bc1c64-200f-492f-bad9-dfecd5687698" containerID="81ab856e9d9d58631859ef51fb4795f49c2b4aa5144303ea9f8681d5edf9bd3e" exitCode=1 Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.255684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" event={"ID":"f6bc1c64-200f-492f-bad9-dfecd5687698","Type":"ContainerDied","Data":"81ab856e9d9d58631859ef51fb4795f49c2b4aa5144303ea9f8681d5edf9bd3e"} Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.256514 4693 scope.go:117] "RemoveContainer" containerID="81ab856e9d9d58631859ef51fb4795f49c2b4aa5144303ea9f8681d5edf9bd3e" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.260006 4693 generic.go:334] "Generic (PLEG): container finished" podID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" containerID="5b4de767d2aad21d15a4460a613fb046c8e38ea7b03c41a5db8cf688493820c2" exitCode=1 Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.260044 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" event={"ID":"4ab70f55-282f-4509-bc36-71ef2fe4d35b","Type":"ContainerDied","Data":"5b4de767d2aad21d15a4460a613fb046c8e38ea7b03c41a5db8cf688493820c2"} Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.260330 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.260349 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.260861 4693 scope.go:117] "RemoveContainer" containerID="5b4de767d2aad21d15a4460a613fb046c8e38ea7b03c41a5db8cf688493820c2" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.293012 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.305427 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.595124 4693 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5998035c-c234-4fc1-979c-8ce88dca4d50" Nov 25 12:32:14 crc kubenswrapper[4693]: I1125 12:32:14.622365 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.290711 4693 generic.go:334] "Generic (PLEG): container finished" podID="c80a0f65-6193-435f-8138-eb5a4ba71b22" containerID="0939e18bfe497776320fc5084e6673c9353b0aa99a7d5728bb9bfff0d248e5cb" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.290834 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" event={"ID":"c80a0f65-6193-435f-8138-eb5a4ba71b22","Type":"ContainerDied","Data":"0939e18bfe497776320fc5084e6673c9353b0aa99a7d5728bb9bfff0d248e5cb"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.292091 4693 scope.go:117] "RemoveContainer" containerID="0939e18bfe497776320fc5084e6673c9353b0aa99a7d5728bb9bfff0d248e5cb" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.297680 4693 generic.go:334] "Generic (PLEG): container finished" podID="105791fd-407d-44a3-8fc8-af90e82b0f63" containerID="389dec33bbdd099de036486f78b012d7fd9380277e97ee8d6f6bdfeb334c84f3" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.297780 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" event={"ID":"105791fd-407d-44a3-8fc8-af90e82b0f63","Type":"ContainerDied","Data":"389dec33bbdd099de036486f78b012d7fd9380277e97ee8d6f6bdfeb334c84f3"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.297817 4693 scope.go:117] "RemoveContainer" containerID="12d86e8aa72882509063f1cd74a6d24c85786c9a11199879e8818192e9d52fff" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.298557 4693 scope.go:117] "RemoveContainer" containerID="389dec33bbdd099de036486f78b012d7fd9380277e97ee8d6f6bdfeb334c84f3" Nov 25 12:32:15 crc kubenswrapper[4693]: E1125 12:32:15.298871 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=watcher-operator-controller-manager-864885998-tc9jb_openstack-operators(105791fd-407d-44a3-8fc8-af90e82b0f63)\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podUID="105791fd-407d-44a3-8fc8-af90e82b0f63" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.303678 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" event={"ID":"a7c4eb9b-38af-41da-872e-b3da515b2f88","Type":"ContainerStarted","Data":"dfcdb950110d3a83b2f7dc00a42bee57cb5ff081af4b4d0e28055f9456ab6195"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.303981 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.305716 4693 generic.go:334] "Generic (PLEG): container finished" podID="1c7db975-17d7-48dd-8e5a-0549749ab866" containerID="4005ff4e6f0777123cda9b38521e0a98cf316b807919df05e588c2235579f4a9" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.305827 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" event={"ID":"1c7db975-17d7-48dd-8e5a-0549749ab866","Type":"ContainerDied","Data":"4005ff4e6f0777123cda9b38521e0a98cf316b807919df05e588c2235579f4a9"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 
12:32:15.306616 4693 scope.go:117] "RemoveContainer" containerID="4005ff4e6f0777123cda9b38521e0a98cf316b807919df05e588c2235579f4a9" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.308229 4693 generic.go:334] "Generic (PLEG): container finished" podID="b9227546-dcce-4b09-9311-19f844deb318" containerID="d18ccdab540642b91a8050bb99fcb971d006c2fbaf87724cade89f3cfd6b5d2e" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.308295 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" event={"ID":"b9227546-dcce-4b09-9311-19f844deb318","Type":"ContainerDied","Data":"d18ccdab540642b91a8050bb99fcb971d006c2fbaf87724cade89f3cfd6b5d2e"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.308830 4693 scope.go:117] "RemoveContainer" containerID="d18ccdab540642b91a8050bb99fcb971d006c2fbaf87724cade89f3cfd6b5d2e" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.310814 4693 generic.go:334] "Generic (PLEG): container finished" podID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" containerID="131d8b9e16e8795b8042d2ca09411459ec391c6df8a164a8c3d613b036f0ec81" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.310839 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" event={"ID":"4ab70f55-282f-4509-bc36-71ef2fe4d35b","Type":"ContainerDied","Data":"131d8b9e16e8795b8042d2ca09411459ec391c6df8a164a8c3d613b036f0ec81"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.311453 4693 scope.go:117] "RemoveContainer" containerID="131d8b9e16e8795b8042d2ca09411459ec391c6df8a164a8c3d613b036f0ec81" Nov 25 12:32:15 crc kubenswrapper[4693]: E1125 12:32:15.311711 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-79856dc55c-4lt8v_openstack-operators(4ab70f55-282f-4509-bc36-71ef2fe4d35b)\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" podUID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.314444 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" event={"ID":"ef0b302b-05d0-4be3-85ad-7eb3d70cec36","Type":"ContainerStarted","Data":"d0f438078eec22f8c0395eff0d588c801a0befb27fe1d3a73e9a1f1073161004"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.314641 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.316822 4693 generic.go:334] "Generic (PLEG): container finished" podID="fe2a0074-66dc-4730-9321-772ee8fd8e28" containerID="4c72935e8f3b0f44c87f61604309800764180e90f764c2095fb8d658245ef2aa" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.316905 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" event={"ID":"fe2a0074-66dc-4730-9321-772ee8fd8e28","Type":"ContainerDied","Data":"4c72935e8f3b0f44c87f61604309800764180e90f764c2095fb8d658245ef2aa"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.317556 4693 scope.go:117] "RemoveContainer" containerID="4c72935e8f3b0f44c87f61604309800764180e90f764c2095fb8d658245ef2aa" Nov 25 12:32:15 crc kubenswrapper[4693]: 
I1125 12:32:15.323101 4693 generic.go:334] "Generic (PLEG): container finished" podID="f6bc1c64-200f-492f-bad9-dfecd5687698" containerID="17de0afe28901c5936bb9c0db152ea67ba9b918bd046d5e4fc2e53e67af688ec" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.323269 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" event={"ID":"f6bc1c64-200f-492f-bad9-dfecd5687698","Type":"ContainerDied","Data":"17de0afe28901c5936bb9c0db152ea67ba9b918bd046d5e4fc2e53e67af688ec"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.324438 4693 scope.go:117] "RemoveContainer" containerID="17de0afe28901c5936bb9c0db152ea67ba9b918bd046d5e4fc2e53e67af688ec" Nov 25 12:32:15 crc kubenswrapper[4693]: E1125 12:32:15.324711 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-f4trp_openstack-operators(f6bc1c64-200f-492f-bad9-dfecd5687698)\"" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podUID="f6bc1c64-200f-492f-bad9-dfecd5687698" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.334230 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" event={"ID":"3f8577c4-f507-4e40-b284-66d57b0aee3d","Type":"ContainerDied","Data":"0c6292d656a5ccb562926ca6ff194b461856370674a12ed89f6a3f61f9703ecd"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.334266 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c6292d656a5ccb562926ca6ff194b461856370674a12ed89f6a3f61f9703ecd" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.334279 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.336257 4693 generic.go:334] "Generic (PLEG): container finished" podID="0f35f544-581e-4cb2-900f-71213e27477d" containerID="4ea6ff7d5e76fde10f591e8024510800752b4480ce89adb7f4b46a5d8acf1e11" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.336320 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" event={"ID":"0f35f544-581e-4cb2-900f-71213e27477d","Type":"ContainerDied","Data":"4ea6ff7d5e76fde10f591e8024510800752b4480ce89adb7f4b46a5d8acf1e11"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.337151 4693 scope.go:117] "RemoveContainer" containerID="4ea6ff7d5e76fde10f591e8024510800752b4480ce89adb7f4b46a5d8acf1e11" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.338550 4693 generic.go:334] "Generic (PLEG): container finished" podID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" containerID="526fd6415be6e0e16318eee3605a03a9f940d9cd41a789b9d17dab825cae7f64" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.338663 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" event={"ID":"3c29e8b9-57cf-4967-b5e2-a6af42c16099","Type":"ContainerDied","Data":"526fd6415be6e0e16318eee3605a03a9f940d9cd41a789b9d17dab825cae7f64"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.339100 4693 scope.go:117] "RemoveContainer" containerID="526fd6415be6e0e16318eee3605a03a9f940d9cd41a789b9d17dab825cae7f64" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.340776 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c98082e-070e-42b1-afdc-69cea132629e" containerID="96d6732128d018481c9c57f8412853a6a18b9b0a5b93dba81049f56bfa6bb6ad" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.340831 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" event={"ID":"5c98082e-070e-42b1-afdc-69cea132629e","Type":"ContainerDied","Data":"96d6732128d018481c9c57f8412853a6a18b9b0a5b93dba81049f56bfa6bb6ad"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.341153 4693 scope.go:117] "RemoveContainer" containerID="96d6732128d018481c9c57f8412853a6a18b9b0a5b93dba81049f56bfa6bb6ad" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.347005 4693 generic.go:334] "Generic (PLEG): container finished" podID="b29c9c21-026a-4701-99a7-769d382a2da2" containerID="face5f24f67a4abbb0ec061572bb135258d901f43ebebe0ae7dfa2c8eee90cd7" exitCode=1 Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.347064 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" event={"ID":"b29c9c21-026a-4701-99a7-769d382a2da2","Type":"ContainerDied","Data":"face5f24f67a4abbb0ec061572bb135258d901f43ebebe0ae7dfa2c8eee90cd7"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.348965 4693 scope.go:117] "RemoveContainer" containerID="face5f24f67a4abbb0ec061572bb135258d901f43ebebe0ae7dfa2c8eee90cd7" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.350797 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" 
event={"ID":"ebf85cb6-2651-4b5f-9cbe-973db55e14c5","Type":"ContainerStarted","Data":"75735f7bdc7ac85b7de5350e6ae425362b1f7fe7d21aa6556e8bda6efc87e8ea"} Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.351237 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.353625 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.353646 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.560554 4693 scope.go:117] "RemoveContainer" containerID="5b4de767d2aad21d15a4460a613fb046c8e38ea7b03c41a5db8cf688493820c2" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.618829 4693 scope.go:117] "RemoveContainer" containerID="81ab856e9d9d58631859ef51fb4795f49c2b4aa5144303ea9f8681d5edf9bd3e" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.734305 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-ssh-key\") pod \"3f8577c4-f507-4e40-b284-66d57b0aee3d\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.734392 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqmml\" (UniqueName: \"kubernetes.io/projected/3f8577c4-f507-4e40-b284-66d57b0aee3d-kube-api-access-wqmml\") pod \"3f8577c4-f507-4e40-b284-66d57b0aee3d\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.734417 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-inventory\") pod \"3f8577c4-f507-4e40-b284-66d57b0aee3d\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.734459 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-repo-setup-combined-ca-bundle\") pod \"3f8577c4-f507-4e40-b284-66d57b0aee3d\" (UID: \"3f8577c4-f507-4e40-b284-66d57b0aee3d\") " Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.742076 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3f8577c4-f507-4e40-b284-66d57b0aee3d" (UID: "3f8577c4-f507-4e40-b284-66d57b0aee3d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.742131 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8577c4-f507-4e40-b284-66d57b0aee3d-kube-api-access-wqmml" (OuterVolumeSpecName: "kube-api-access-wqmml") pod "3f8577c4-f507-4e40-b284-66d57b0aee3d" (UID: "3f8577c4-f507-4e40-b284-66d57b0aee3d"). InnerVolumeSpecName "kube-api-access-wqmml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.784343 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-inventory" (OuterVolumeSpecName: "inventory") pod "3f8577c4-f507-4e40-b284-66d57b0aee3d" (UID: "3f8577c4-f507-4e40-b284-66d57b0aee3d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.832701 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3f8577c4-f507-4e40-b284-66d57b0aee3d" (UID: "3f8577c4-f507-4e40-b284-66d57b0aee3d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.838576 4693 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.838606 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.838620 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqmml\" (UniqueName: \"kubernetes.io/projected/3f8577c4-f507-4e40-b284-66d57b0aee3d-kube-api-access-wqmml\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:15 crc kubenswrapper[4693]: I1125 12:32:15.838635 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8577c4-f507-4e40-b284-66d57b0aee3d-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.335214 4693 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5998035c-c234-4fc1-979c-8ce88dca4d50" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.363603 4693 generic.go:334] "Generic (PLEG): container finished" podID="fe2a0074-66dc-4730-9321-772ee8fd8e28" containerID="90154e0c65b206bd7f11b146252026bf49c29e8a600c5c7dc1ed554dd1cc4eca" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.363674 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" event={"ID":"fe2a0074-66dc-4730-9321-772ee8fd8e28","Type":"ContainerDied","Data":"90154e0c65b206bd7f11b146252026bf49c29e8a600c5c7dc1ed554dd1cc4eca"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.363737 4693 scope.go:117] "RemoveContainer" containerID="4c72935e8f3b0f44c87f61604309800764180e90f764c2095fb8d658245ef2aa" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.364602 4693 scope.go:117] "RemoveContainer" containerID="90154e0c65b206bd7f11b146252026bf49c29e8a600c5c7dc1ed554dd1cc4eca" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.365083 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=octavia-operator-controller-manager-fd75fd47d-g972v_openstack-operators(fe2a0074-66dc-4730-9321-772ee8fd8e28)\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" podUID="fe2a0074-66dc-4730-9321-772ee8fd8e28" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.374358 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c98082e-070e-42b1-afdc-69cea132629e" containerID="27c5891a0eb8db55cb8f850039af58dc9697b7a9362fc7622192e963ab293556" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.374418 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" event={"ID":"5c98082e-070e-42b1-afdc-69cea132629e","Type":"ContainerDied","Data":"27c5891a0eb8db55cb8f850039af58dc9697b7a9362fc7622192e963ab293556"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.375353 4693 scope.go:117] "RemoveContainer" containerID="27c5891a0eb8db55cb8f850039af58dc9697b7a9362fc7622192e963ab293556" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.375778 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-d5cc86f4b-r86ct_openstack-operators(5c98082e-070e-42b1-afdc-69cea132629e)\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" podUID="5c98082e-070e-42b1-afdc-69cea132629e" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.377170 4693 generic.go:334] "Generic (PLEG): container finished" podID="c80a0f65-6193-435f-8138-eb5a4ba71b22" containerID="b01e14a06ec7f21819ed59ce624d425a1074b68d1e4626645e57833ac79e0413" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.377235 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" event={"ID":"c80a0f65-6193-435f-8138-eb5a4ba71b22","Type":"ContainerDied","Data":"b01e14a06ec7f21819ed59ce624d425a1074b68d1e4626645e57833ac79e0413"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.377881 4693 scope.go:117] "RemoveContainer" containerID="b01e14a06ec7f21819ed59ce624d425a1074b68d1e4626645e57833ac79e0413" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.378132 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=openstack-operator-controller-manager-7cd5954d9-rqjq9_openstack-operators(c80a0f65-6193-435f-8138-eb5a4ba71b22)\"" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" podUID="c80a0f65-6193-435f-8138-eb5a4ba71b22" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.380439 4693 generic.go:334] "Generic (PLEG): container finished" podID="bfeee7c1-207f-4862-b172-f2ffab4a1500" containerID="606ac57bd54bda616ef63549e79ade51118f68fc2004439f20bd3928b67d782c" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.380515 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" event={"ID":"bfeee7c1-207f-4862-b172-f2ffab4a1500","Type":"ContainerDied","Data":"606ac57bd54bda616ef63549e79ade51118f68fc2004439f20bd3928b67d782c"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.381516 4693 scope.go:117] "RemoveContainer" containerID="606ac57bd54bda616ef63549e79ade51118f68fc2004439f20bd3928b67d782c" Nov 
25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.386880 4693 generic.go:334] "Generic (PLEG): container finished" podID="b9227546-dcce-4b09-9311-19f844deb318" containerID="56023f15cd0a1e16d627b8e19cf69c999dbe8a18ef01ce954fe3403a4319e909" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.386963 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" event={"ID":"b9227546-dcce-4b09-9311-19f844deb318","Type":"ContainerDied","Data":"56023f15cd0a1e16d627b8e19cf69c999dbe8a18ef01ce954fe3403a4319e909"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.387819 4693 scope.go:117] "RemoveContainer" containerID="56023f15cd0a1e16d627b8e19cf69c999dbe8a18ef01ce954fe3403a4319e909" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.388259 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=telemetry-operator-controller-manager-567f98c9d-cwrvs_openstack-operators(b9227546-dcce-4b09-9311-19f844deb318)\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.394214 4693 generic.go:334] "Generic (PLEG): container finished" podID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" containerID="3e7456d5d3738801a4ed8914bfe561a50c956903bcf9bc7669432ec6f3e8d30c" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.394265 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" event={"ID":"3c29e8b9-57cf-4967-b5e2-a6af42c16099","Type":"ContainerDied","Data":"3e7456d5d3738801a4ed8914bfe561a50c956903bcf9bc7669432ec6f3e8d30c"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.394890 4693 scope.go:117] "RemoveContainer" containerID="3e7456d5d3738801a4ed8914bfe561a50c956903bcf9bc7669432ec6f3e8d30c" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.395123 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-szrv4_openstack-operators(3c29e8b9-57cf-4967-b5e2-a6af42c16099)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podUID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.398262 4693 generic.go:334] "Generic (PLEG): container finished" podID="4dd9cd53-1f66-4636-9fab-9f0b3ff38009" containerID="a8e22ea9c6297db7676d5003b03a7b98815c1563cb3cd4b959529c23feb4b068" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.398335 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" event={"ID":"4dd9cd53-1f66-4636-9fab-9f0b3ff38009","Type":"ContainerDied","Data":"a8e22ea9c6297db7676d5003b03a7b98815c1563cb3cd4b959529c23feb4b068"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.398867 4693 scope.go:117] "RemoveContainer" containerID="a8e22ea9c6297db7676d5003b03a7b98815c1563cb3cd4b959529c23feb4b068" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.401098 4693 generic.go:334] "Generic (PLEG): container finished" podID="22a83ecc-1f72-4474-a470-2ee4bef7eddf" 
containerID="68c3e5c6880f7109205001bca2c88ce5139e3c4f07e97bd9efc93eb7b54c536b" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.401172 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" event={"ID":"22a83ecc-1f72-4474-a470-2ee4bef7eddf","Type":"ContainerDied","Data":"68c3e5c6880f7109205001bca2c88ce5139e3c4f07e97bd9efc93eb7b54c536b"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.401613 4693 scope.go:117] "RemoveContainer" containerID="68c3e5c6880f7109205001bca2c88ce5139e3c4f07e97bd9efc93eb7b54c536b" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.403138 4693 generic.go:334] "Generic (PLEG): container finished" podID="2f11c884-15fc-4e2a-a533-d0eac0639f80" containerID="c133d166c5bdf16a1e28b10129f73de03ab48a6583ec62dd9c5aa2f8c72b92e6" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.403211 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" event={"ID":"2f11c884-15fc-4e2a-a533-d0eac0639f80","Type":"ContainerDied","Data":"c133d166c5bdf16a1e28b10129f73de03ab48a6583ec62dd9c5aa2f8c72b92e6"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.403869 4693 scope.go:117] "RemoveContainer" containerID="c133d166c5bdf16a1e28b10129f73de03ab48a6583ec62dd9c5aa2f8c72b92e6" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.407677 4693 generic.go:334] "Generic (PLEG): container finished" podID="0f35f544-581e-4cb2-900f-71213e27477d" containerID="132c7b926ee6482acad15da9715a9fed6619442a7e93dc27da6375a9d1f95082" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.407734 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" event={"ID":"0f35f544-581e-4cb2-900f-71213e27477d","Type":"ContainerDied","Data":"132c7b926ee6482acad15da9715a9fed6619442a7e93dc27da6375a9d1f95082"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.408160 4693 scope.go:117] "RemoveContainer" containerID="132c7b926ee6482acad15da9715a9fed6619442a7e93dc27da6375a9d1f95082" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.408478 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=neutron-operator-controller-manager-7c57c8bbc4-csrpt_openstack-operators(0f35f544-581e-4cb2-900f-71213e27477d)\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" podUID="0f35f544-581e-4cb2-900f-71213e27477d" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.424944 4693 generic.go:334] "Generic (PLEG): container finished" podID="9cc5c4a9-0119-48b6-a795-9f482b55278b" containerID="98f562f76f85fc439569aec3906d2175d30c8be267cc242d39a6d77c0ea98b82" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.425021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" event={"ID":"9cc5c4a9-0119-48b6-a795-9f482b55278b","Type":"ContainerDied","Data":"98f562f76f85fc439569aec3906d2175d30c8be267cc242d39a6d77c0ea98b82"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.425616 4693 scope.go:117] "RemoveContainer" containerID="98f562f76f85fc439569aec3906d2175d30c8be267cc242d39a6d77c0ea98b82" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.434281 4693 generic.go:334] "Generic (PLEG): 
container finished" podID="a64b0f5c-e6af-4903-925a-028aec5477fd" containerID="eabf73ef1d8ddebd6d00e1e3b35c56f3423f2a0ec131aeeb95d9da26f0729f3e" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.434351 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" event={"ID":"a64b0f5c-e6af-4903-925a-028aec5477fd","Type":"ContainerDied","Data":"eabf73ef1d8ddebd6d00e1e3b35c56f3423f2a0ec131aeeb95d9da26f0729f3e"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.434910 4693 scope.go:117] "RemoveContainer" containerID="eabf73ef1d8ddebd6d00e1e3b35c56f3423f2a0ec131aeeb95d9da26f0729f3e" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.437074 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ecc8c23-d9b2-4d46-a8b0-76758035b267" containerID="ddb9b505fe4a4467877b656fa5b5cdbbc8fd54bcf2d5d3a6ae2c4bd1f705d3a8" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.437136 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" event={"ID":"7ecc8c23-d9b2-4d46-a8b0-76758035b267","Type":"ContainerDied","Data":"ddb9b505fe4a4467877b656fa5b5cdbbc8fd54bcf2d5d3a6ae2c4bd1f705d3a8"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.437493 4693 scope.go:117] "RemoveContainer" containerID="ddb9b505fe4a4467877b656fa5b5cdbbc8fd54bcf2d5d3a6ae2c4bd1f705d3a8" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.443770 4693 generic.go:334] "Generic (PLEG): container finished" podID="7cb65a4e-3294-4104-b3bf-6d1103b92c38" containerID="04b9c09e42dc86f972d449310f260b434a79a4329b2f39c1dd3d390775a4a871" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.443967 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" event={"ID":"7cb65a4e-3294-4104-b3bf-6d1103b92c38","Type":"ContainerDied","Data":"04b9c09e42dc86f972d449310f260b434a79a4329b2f39c1dd3d390775a4a871"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.445092 4693 scope.go:117] "RemoveContainer" containerID="04b9c09e42dc86f972d449310f260b434a79a4329b2f39c1dd3d390775a4a871" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.448717 4693 generic.go:334] "Generic (PLEG): container finished" podID="1c7db975-17d7-48dd-8e5a-0549749ab866" containerID="40493dd1f7193de376cf060f08fbbdaf773d14634bbd5fdd8e992107dce4cc9c" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.448762 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" event={"ID":"1c7db975-17d7-48dd-8e5a-0549749ab866","Type":"ContainerDied","Data":"40493dd1f7193de376cf060f08fbbdaf773d14634bbd5fdd8e992107dce4cc9c"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.449100 4693 scope.go:117] "RemoveContainer" containerID="40493dd1f7193de376cf060f08fbbdaf773d14634bbd5fdd8e992107dce4cc9c" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.449301 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ovn-operator-controller-manager-66cf5c67ff-k2njb_openstack-operators(1c7db975-17d7-48dd-8e5a-0549749ab866)\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podUID="1c7db975-17d7-48dd-8e5a-0549749ab866" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 
12:32:16.450924 4693 scope.go:117] "RemoveContainer" containerID="96d6732128d018481c9c57f8412853a6a18b9b0a5b93dba81049f56bfa6bb6ad" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.452002 4693 generic.go:334] "Generic (PLEG): container finished" podID="28782f20-4534-4137-b590-7a3b31c638b2" containerID="7c126151898ffab73ef187f632f68d5495628bc12473d8a45d20ebf6aa04f3be" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.452063 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" event={"ID":"28782f20-4534-4137-b590-7a3b31c638b2","Type":"ContainerDied","Data":"7c126151898ffab73ef187f632f68d5495628bc12473d8a45d20ebf6aa04f3be"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.452472 4693 scope.go:117] "RemoveContainer" containerID="7c126151898ffab73ef187f632f68d5495628bc12473d8a45d20ebf6aa04f3be" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.463331 4693 generic.go:334] "Generic (PLEG): container finished" podID="b29c9c21-026a-4701-99a7-769d382a2da2" containerID="9591f3ac45e35952c205b0fa2e1f1b9950f2bc3dd784bb9ac266219b16b19065" exitCode=1 Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.463548 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" event={"ID":"b29c9c21-026a-4701-99a7-769d382a2da2","Type":"ContainerDied","Data":"9591f3ac45e35952c205b0fa2e1f1b9950f2bc3dd784bb9ac266219b16b19065"} Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.463958 4693 scope.go:117] "RemoveContainer" containerID="9591f3ac45e35952c205b0fa2e1f1b9950f2bc3dd784bb9ac266219b16b19065" Nov 25 12:32:16 crc kubenswrapper[4693]: E1125 12:32:16.464203 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=heat-operator-controller-manager-774b86978c-nzz29_openstack-operators(b29c9c21-026a-4701-99a7-769d382a2da2)\"" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" podUID="b29c9c21-026a-4701-99a7-769d382a2da2" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.649463 4693 scope.go:117] "RemoveContainer" containerID="0939e18bfe497776320fc5084e6673c9353b0aa99a7d5728bb9bfff0d248e5cb" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.848137 4693 scope.go:117] "RemoveContainer" containerID="d18ccdab540642b91a8050bb99fcb971d006c2fbaf87724cade89f3cfd6b5d2e" Nov 25 12:32:16 crc kubenswrapper[4693]: I1125 12:32:16.975009 4693 scope.go:117] "RemoveContainer" containerID="526fd6415be6e0e16318eee3605a03a9f940d9cd41a789b9d17dab825cae7f64" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.058452 4693 scope.go:117] "RemoveContainer" containerID="4ea6ff7d5e76fde10f591e8024510800752b4480ce89adb7f4b46a5d8acf1e11" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.116889 4693 scope.go:117] "RemoveContainer" containerID="4005ff4e6f0777123cda9b38521e0a98cf316b807919df05e588c2235579f4a9" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.146050 4693 scope.go:117] "RemoveContainer" containerID="face5f24f67a4abbb0ec061572bb135258d901f43ebebe0ae7dfa2c8eee90cd7" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.476217 4693 generic.go:334] "Generic (PLEG): container finished" podID="7ecc8c23-d9b2-4d46-a8b0-76758035b267" containerID="3f44b807f8c7da0ee9410ad2dfbb6c2020c6a817f2a6412ecf9abf9259dc183a" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 
12:32:17.476278 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" event={"ID":"7ecc8c23-d9b2-4d46-a8b0-76758035b267","Type":"ContainerDied","Data":"3f44b807f8c7da0ee9410ad2dfbb6c2020c6a817f2a6412ecf9abf9259dc183a"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.476334 4693 scope.go:117] "RemoveContainer" containerID="ddb9b505fe4a4467877b656fa5b5cdbbc8fd54bcf2d5d3a6ae2c4bd1f705d3a8" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.477045 4693 scope.go:117] "RemoveContainer" containerID="3f44b807f8c7da0ee9410ad2dfbb6c2020c6a817f2a6412ecf9abf9259dc183a" Nov 25 12:32:17 crc kubenswrapper[4693]: E1125 12:32:17.477446 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-flxdz_openstack-operators(7ecc8c23-d9b2-4d46-a8b0-76758035b267)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" podUID="7ecc8c23-d9b2-4d46-a8b0-76758035b267" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.482651 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" event={"ID":"28782f20-4534-4137-b590-7a3b31c638b2","Type":"ContainerStarted","Data":"8db3ebb13898f8b25875dbef6b6608b580d63375fb3c4f8a0b9bae3b59f6156c"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.493182 4693 generic.go:334] "Generic (PLEG): container finished" podID="2f11c884-15fc-4e2a-a533-d0eac0639f80" containerID="06535766f329732625b6f90ef696ab32b44ec08c2e6f46ce58068be14e57d952" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.493262 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" event={"ID":"2f11c884-15fc-4e2a-a533-d0eac0639f80","Type":"ContainerDied","Data":"06535766f329732625b6f90ef696ab32b44ec08c2e6f46ce58068be14e57d952"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.494107 4693 scope.go:117] "RemoveContainer" containerID="06535766f329732625b6f90ef696ab32b44ec08c2e6f46ce58068be14e57d952" Nov 25 12:32:17 crc kubenswrapper[4693]: E1125 12:32:17.494609 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=barbican-operator-controller-manager-86dc4d89c8-6wxtj_openstack-operators(2f11c884-15fc-4e2a-a533-d0eac0639f80)\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" podUID="2f11c884-15fc-4e2a-a533-d0eac0639f80" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.495293 4693 generic.go:334] "Generic (PLEG): container finished" podID="4dd9cd53-1f66-4636-9fab-9f0b3ff38009" containerID="08262cc82f0084020fdb0b7bbef1a21def26efff477125af0d4aa8a3410b850b" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.495364 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" event={"ID":"4dd9cd53-1f66-4636-9fab-9f0b3ff38009","Type":"ContainerDied","Data":"08262cc82f0084020fdb0b7bbef1a21def26efff477125af0d4aa8a3410b850b"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.496013 4693 scope.go:117] "RemoveContainer" containerID="08262cc82f0084020fdb0b7bbef1a21def26efff477125af0d4aa8a3410b850b" Nov 25 12:32:17 crc 
kubenswrapper[4693]: E1125 12:32:17.496243 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-68c9694994-fwwsj_openstack-operators(4dd9cd53-1f66-4636-9fab-9f0b3ff38009)\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" podUID="4dd9cd53-1f66-4636-9fab-9f0b3ff38009" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.499317 4693 generic.go:334] "Generic (PLEG): container finished" podID="22a83ecc-1f72-4474-a470-2ee4bef7eddf" containerID="f924c555e294ce79df52550a36a751d3db1c3720b8db55d8ae80a2dced02311b" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.499452 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" event={"ID":"22a83ecc-1f72-4474-a470-2ee4bef7eddf","Type":"ContainerDied","Data":"f924c555e294ce79df52550a36a751d3db1c3720b8db55d8ae80a2dced02311b"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.500206 4693 scope.go:117] "RemoveContainer" containerID="f924c555e294ce79df52550a36a751d3db1c3720b8db55d8ae80a2dced02311b" Nov 25 12:32:17 crc kubenswrapper[4693]: E1125 12:32:17.500547 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-cb6c4fdb7-s9shw_openstack-operators(22a83ecc-1f72-4474-a470-2ee4bef7eddf)\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" podUID="22a83ecc-1f72-4474-a470-2ee4bef7eddf" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.504050 4693 generic.go:334] "Generic (PLEG): container finished" podID="bfeee7c1-207f-4862-b172-f2ffab4a1500" containerID="d42c4c12bfedad564a910c45b2917090e91b1354f18a877a812d855bc764c9c3" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.504117 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" event={"ID":"bfeee7c1-207f-4862-b172-f2ffab4a1500","Type":"ContainerDied","Data":"d42c4c12bfedad564a910c45b2917090e91b1354f18a877a812d855bc764c9c3"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.504883 4693 scope.go:117] "RemoveContainer" containerID="d42c4c12bfedad564a910c45b2917090e91b1354f18a877a812d855bc764c9c3" Nov 25 12:32:17 crc kubenswrapper[4693]: E1125 12:32:17.505175 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=manila-operator-controller-manager-58bb8d67cc-5ghnq_openstack-operators(bfeee7c1-207f-4862-b172-f2ffab4a1500)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" podUID="bfeee7c1-207f-4862-b172-f2ffab4a1500" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.514188 4693 generic.go:334] "Generic (PLEG): container finished" podID="7cb65a4e-3294-4104-b3bf-6d1103b92c38" containerID="8491d915b326a12eb45781446aab101877da8f76fd20abe6575ea9e33dfcaf96" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.514254 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" 
event={"ID":"7cb65a4e-3294-4104-b3bf-6d1103b92c38","Type":"ContainerDied","Data":"8491d915b326a12eb45781446aab101877da8f76fd20abe6575ea9e33dfcaf96"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.515398 4693 scope.go:117] "RemoveContainer" containerID="8491d915b326a12eb45781446aab101877da8f76fd20abe6575ea9e33dfcaf96" Nov 25 12:32:17 crc kubenswrapper[4693]: E1125 12:32:17.515696 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-68b95954c9-866fd_openstack-operators(7cb65a4e-3294-4104-b3bf-6d1103b92c38)\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" podUID="7cb65a4e-3294-4104-b3bf-6d1103b92c38" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.518686 4693 generic.go:334] "Generic (PLEG): container finished" podID="9cc5c4a9-0119-48b6-a795-9f482b55278b" containerID="de1061780f67cdb7bb611e2ffb2f5767cd0c0eaf777a843e7012a6209aa6f4da" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.518749 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" event={"ID":"9cc5c4a9-0119-48b6-a795-9f482b55278b","Type":"ContainerDied","Data":"de1061780f67cdb7bb611e2ffb2f5767cd0c0eaf777a843e7012a6209aa6f4da"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.519542 4693 scope.go:117] "RemoveContainer" containerID="de1061780f67cdb7bb611e2ffb2f5767cd0c0eaf777a843e7012a6209aa6f4da" Nov 25 12:32:17 crc kubenswrapper[4693]: E1125 12:32:17.519939 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=designate-operator-controller-manager-7d695c9b56-6dtx6_openstack-operators(9cc5c4a9-0119-48b6-a795-9f482b55278b)\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" podUID="9cc5c4a9-0119-48b6-a795-9f482b55278b" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.520912 4693 generic.go:334] "Generic (PLEG): container finished" podID="a64b0f5c-e6af-4903-925a-028aec5477fd" containerID="473ab5cc9f5e495f0026948091ceb90c208d92b0b297ed96075e8ea4e08d01a7" exitCode=1 Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.520973 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" event={"ID":"a64b0f5c-e6af-4903-925a-028aec5477fd","Type":"ContainerDied","Data":"473ab5cc9f5e495f0026948091ceb90c208d92b0b297ed96075e8ea4e08d01a7"} Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.521341 4693 scope.go:117] "RemoveContainer" containerID="473ab5cc9f5e495f0026948091ceb90c208d92b0b297ed96075e8ea4e08d01a7" Nov 25 12:32:17 crc kubenswrapper[4693]: E1125 12:32:17.521615 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-748dc6576f-zcpsz_openstack-operators(a64b0f5c-e6af-4903-925a-028aec5477fd)\"" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" podUID="a64b0f5c-e6af-4903-925a-028aec5477fd" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.721943 4693 scope.go:117] "RemoveContainer" containerID="c133d166c5bdf16a1e28b10129f73de03ab48a6583ec62dd9c5aa2f8c72b92e6" Nov 25 12:32:17 crc 
kubenswrapper[4693]: I1125 12:32:17.863240 4693 scope.go:117] "RemoveContainer" containerID="a8e22ea9c6297db7676d5003b03a7b98815c1563cb3cd4b959529c23feb4b068" Nov 25 12:32:17 crc kubenswrapper[4693]: I1125 12:32:17.958857 4693 scope.go:117] "RemoveContainer" containerID="68c3e5c6880f7109205001bca2c88ce5139e3c4f07e97bd9efc93eb7b54c536b" Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.058802 4693 scope.go:117] "RemoveContainer" containerID="606ac57bd54bda616ef63549e79ade51118f68fc2004439f20bd3928b67d782c" Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.128460 4693 scope.go:117] "RemoveContainer" containerID="04b9c09e42dc86f972d449310f260b434a79a4329b2f39c1dd3d390775a4a871" Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.262275 4693 scope.go:117] "RemoveContainer" containerID="98f562f76f85fc439569aec3906d2175d30c8be267cc242d39a6d77c0ea98b82" Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.505832 4693 scope.go:117] "RemoveContainer" containerID="eabf73ef1d8ddebd6d00e1e3b35c56f3423f2a0ec131aeeb95d9da26f0729f3e" Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.544957 4693 generic.go:334] "Generic (PLEG): container finished" podID="28782f20-4534-4137-b590-7a3b31c638b2" containerID="8db3ebb13898f8b25875dbef6b6608b580d63375fb3c4f8a0b9bae3b59f6156c" exitCode=1 Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.545151 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" event={"ID":"28782f20-4534-4137-b590-7a3b31c638b2","Type":"ContainerDied","Data":"8db3ebb13898f8b25875dbef6b6608b580d63375fb3c4f8a0b9bae3b59f6156c"} Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.546095 4693 scope.go:117] "RemoveContainer" containerID="8db3ebb13898f8b25875dbef6b6608b580d63375fb3c4f8a0b9bae3b59f6156c" Nov 25 12:32:18 crc kubenswrapper[4693]: E1125 12:32:18.546545 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=operator pod=rabbitmq-cluster-operator-manager-668c99d594-qbjp2_openstack-operators(28782f20-4534-4137-b590-7a3b31c638b2)\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" podUID="28782f20-4534-4137-b590-7a3b31c638b2" Nov 25 12:32:18 crc kubenswrapper[4693]: I1125 12:32:18.564851 4693 scope.go:117] "RemoveContainer" containerID="7c126151898ffab73ef187f632f68d5495628bc12473d8a45d20ebf6aa04f3be" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.135982 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.136069 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.137090 4693 scope.go:117] "RemoveContainer" containerID="06535766f329732625b6f90ef696ab32b44ec08c2e6f46ce58068be14e57d952" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.137698 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=barbican-operator-controller-manager-86dc4d89c8-6wxtj_openstack-operators(2f11c884-15fc-4e2a-a533-d0eac0639f80)\"" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" 
podUID="2f11c884-15fc-4e2a-a533-d0eac0639f80" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.154455 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.154514 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.155311 4693 scope.go:117] "RemoveContainer" containerID="131d8b9e16e8795b8042d2ca09411459ec391c6df8a164a8c3d613b036f0ec81" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.155680 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=cinder-operator-controller-manager-79856dc55c-4lt8v_openstack-operators(4ab70f55-282f-4509-bc36-71ef2fe4d35b)\"" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" podUID="4ab70f55-282f-4509-bc36-71ef2fe4d35b" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.196568 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.196626 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.197432 4693 scope.go:117] "RemoveContainer" containerID="de1061780f67cdb7bb611e2ffb2f5767cd0c0eaf777a843e7012a6209aa6f4da" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.197708 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=designate-operator-controller-manager-7d695c9b56-6dtx6_openstack-operators(9cc5c4a9-0119-48b6-a795-9f482b55278b)\"" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" podUID="9cc5c4a9-0119-48b6-a795-9f482b55278b" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.254758 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.254817 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.255562 4693 scope.go:117] "RemoveContainer" containerID="9591f3ac45e35952c205b0fa2e1f1b9950f2bc3dd784bb9ac266219b16b19065" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.255862 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=heat-operator-controller-manager-774b86978c-nzz29_openstack-operators(b29c9c21-026a-4701-99a7-769d382a2da2)\"" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" podUID="b29c9c21-026a-4701-99a7-769d382a2da2" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.301795 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 
12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.301851 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.302646 4693 scope.go:117] "RemoveContainer" containerID="08262cc82f0084020fdb0b7bbef1a21def26efff477125af0d4aa8a3410b850b" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.303050 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=horizon-operator-controller-manager-68c9694994-fwwsj_openstack-operators(4dd9cd53-1f66-4636-9fab-9f0b3ff38009)\"" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" podUID="4dd9cd53-1f66-4636-9fab-9f0b3ff38009" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.453543 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.453598 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.454972 4693 scope.go:117] "RemoveContainer" containerID="27c5891a0eb8db55cb8f850039af58dc9697b7a9362fc7622192e963ab293556" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.455516 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=infra-operator-controller-manager-d5cc86f4b-r86ct_openstack-operators(5c98082e-070e-42b1-afdc-69cea132629e)\"" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" podUID="5c98082e-070e-42b1-afdc-69cea132629e" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.514597 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.514663 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.515509 4693 scope.go:117] "RemoveContainer" containerID="8491d915b326a12eb45781446aab101877da8f76fd20abe6575ea9e33dfcaf96" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.515856 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=glance-operator-controller-manager-68b95954c9-866fd_openstack-operators(7cb65a4e-3294-4104-b3bf-6d1103b92c38)\"" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" podUID="7cb65a4e-3294-4104-b3bf-6d1103b92c38" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.570443 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.570489 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.571146 4693 scope.go:117] 
"RemoveContainer" containerID="3e7456d5d3738801a4ed8914bfe561a50c956903bcf9bc7669432ec6f3e8d30c" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.571437 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-szrv4_openstack-operators(3c29e8b9-57cf-4967-b5e2-a6af42c16099)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podUID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.583808 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.583869 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.584564 4693 scope.go:117] "RemoveContainer" containerID="473ab5cc9f5e495f0026948091ceb90c208d92b0b297ed96075e8ea4e08d01a7" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.584884 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-748dc6576f-zcpsz_openstack-operators(a64b0f5c-e6af-4903-925a-028aec5477fd)\"" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" podUID="a64b0f5c-e6af-4903-925a-028aec5477fd" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.616186 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.616438 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.616658 4693 scope.go:117] "RemoveContainer" containerID="d42c4c12bfedad564a910c45b2917090e91b1354f18a877a812d855bc764c9c3" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.616904 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=manila-operator-controller-manager-58bb8d67cc-5ghnq_openstack-operators(bfeee7c1-207f-4862-b172-f2ffab4a1500)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" podUID="bfeee7c1-207f-4862-b172-f2ffab4a1500" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.638782 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.638829 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.640769 4693 scope.go:117] "RemoveContainer" containerID="f924c555e294ce79df52550a36a751d3db1c3720b8db55d8ae80a2dced02311b" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.641409 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=mariadb-operator-controller-manager-cb6c4fdb7-s9shw_openstack-operators(22a83ecc-1f72-4474-a470-2ee4bef7eddf)\"" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" podUID="22a83ecc-1f72-4474-a470-2ee4bef7eddf" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.697908 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.697995 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.698720 4693 scope.go:117] "RemoveContainer" containerID="132c7b926ee6482acad15da9715a9fed6619442a7e93dc27da6375a9d1f95082" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.699195 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=neutron-operator-controller-manager-7c57c8bbc4-csrpt_openstack-operators(0f35f544-581e-4cb2-900f-71213e27477d)\"" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" podUID="0f35f544-581e-4cb2-900f-71213e27477d" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.763857 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.763940 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.764737 4693 scope.go:117] "RemoveContainer" containerID="3f44b807f8c7da0ee9410ad2dfbb6c2020c6a817f2a6412ecf9abf9259dc183a" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.765019 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=nova-operator-controller-manager-79556f57fc-flxdz_openstack-operators(7ecc8c23-d9b2-4d46-a8b0-76758035b267)\"" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" podUID="7ecc8c23-d9b2-4d46-a8b0-76758035b267" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.843539 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.843615 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.843970 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.844015 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.844482 4693 scope.go:117] "RemoveContainer" containerID="40493dd1f7193de376cf060f08fbbdaf773d14634bbd5fdd8e992107dce4cc9c" Nov 25 12:32:19 crc 
kubenswrapper[4693]: E1125 12:32:19.844788 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ovn-operator-controller-manager-66cf5c67ff-k2njb_openstack-operators(1c7db975-17d7-48dd-8e5a-0549749ab866)\"" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" podUID="1c7db975-17d7-48dd-8e5a-0549749ab866" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.844907 4693 scope.go:117] "RemoveContainer" containerID="90154e0c65b206bd7f11b146252026bf49c29e8a600c5c7dc1ed554dd1cc4eca" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.845198 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=octavia-operator-controller-manager-fd75fd47d-g972v_openstack-operators(fe2a0074-66dc-4730-9321-772ee8fd8e28)\"" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" podUID="fe2a0074-66dc-4730-9321-772ee8fd8e28" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.897382 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.897618 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.898824 4693 scope.go:117] "RemoveContainer" containerID="17de0afe28901c5936bb9c0db152ea67ba9b918bd046d5e4fc2e53e67af688ec" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.899232 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-f4trp_openstack-operators(f6bc1c64-200f-492f-bad9-dfecd5687698)\"" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podUID="f6bc1c64-200f-492f-bad9-dfecd5687698" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.930845 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.931095 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:32:19 crc kubenswrapper[4693]: I1125 12:32:19.931790 4693 scope.go:117] "RemoveContainer" containerID="3398f0e87aacd1d0644fb32e02b5520ecc324408aad1d8f9e4117a34cd6e9623" Nov 25 12:32:19 crc kubenswrapper[4693]: E1125 12:32:19.932125 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=swift-operator-controller-manager-6fdc4fcf86-bnf27_openstack-operators(c3a7c8cb-ac3c-43d3-b38d-0c3625c53196)\"" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" podUID="c3a7c8cb-ac3c-43d3-b38d-0c3625c53196" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.190414 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:32:20 crc 
kubenswrapper[4693]: I1125 12:32:20.190480 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.191188 4693 scope.go:117] "RemoveContainer" containerID="56023f15cd0a1e16d627b8e19cf69c999dbe8a18ef01ce954fe3403a4319e909" Nov 25 12:32:20 crc kubenswrapper[4693]: E1125 12:32:20.191482 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=telemetry-operator-controller-manager-567f98c9d-cwrvs_openstack-operators(b9227546-dcce-4b09-9311-19f844deb318)\"" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" podUID="b9227546-dcce-4b09-9311-19f844deb318" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.267386 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cb74df96-kmpm8" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.566306 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.567573 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.568249 4693 scope.go:117] "RemoveContainer" containerID="389dec33bbdd099de036486f78b012d7fd9380277e97ee8d6f6bdfeb334c84f3" Nov 25 12:32:20 crc kubenswrapper[4693]: E1125 12:32:20.568507 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=watcher-operator-controller-manager-864885998-tc9jb_openstack-operators(105791fd-407d-44a3-8fc8-af90e82b0f63)\"" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" podUID="105791fd-407d-44a3-8fc8-af90e82b0f63" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.575536 4693 scope.go:117] "RemoveContainer" containerID="d42c4c12bfedad564a910c45b2917090e91b1354f18a877a812d855bc764c9c3" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.575579 4693 scope.go:117] "RemoveContainer" containerID="17de0afe28901c5936bb9c0db152ea67ba9b918bd046d5e4fc2e53e67af688ec" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.575614 4693 scope.go:117] "RemoveContainer" containerID="3e7456d5d3738801a4ed8914bfe561a50c956903bcf9bc7669432ec6f3e8d30c" Nov 25 12:32:20 crc kubenswrapper[4693]: E1125 12:32:20.575876 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=ironic-operator-controller-manager-5bfcdc958c-szrv4_openstack-operators(3c29e8b9-57cf-4967-b5e2-a6af42c16099)\"" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" podUID="3c29e8b9-57cf-4967-b5e2-a6af42c16099" Nov 25 12:32:20 crc kubenswrapper[4693]: E1125 12:32:20.575928 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=placement-operator-controller-manager-5db546f9d9-f4trp_openstack-operators(f6bc1c64-200f-492f-bad9-dfecd5687698)\"" 
pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" podUID="f6bc1c64-200f-492f-bad9-dfecd5687698" Nov 25 12:32:20 crc kubenswrapper[4693]: E1125 12:32:20.575959 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=manila-operator-controller-manager-58bb8d67cc-5ghnq_openstack-operators(bfeee7c1-207f-4862-b172-f2ffab4a1500)\"" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" podUID="bfeee7c1-207f-4862-b172-f2ffab4a1500" Nov 25 12:32:20 crc kubenswrapper[4693]: I1125 12:32:20.868291 4693 scope.go:117] "RemoveContainer" containerID="9276d800f34d5243ede48bbfd8f32cad496ae6ac720cacf0b6ce0acc117dae10" Nov 25 12:32:21 crc kubenswrapper[4693]: I1125 12:32:21.587256 4693 generic.go:334] "Generic (PLEG): container finished" podID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" containerID="0dd439932dc2065c6315289be8b718978b417112a7eb20be9309352a74dc3898" exitCode=1 Nov 25 12:32:21 crc kubenswrapper[4693]: I1125 12:32:21.587337 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" event={"ID":"0d2b9e6f-fe11-47e3-af7b-cca0fff65798","Type":"ContainerDied","Data":"0dd439932dc2065c6315289be8b718978b417112a7eb20be9309352a74dc3898"} Nov 25 12:32:21 crc kubenswrapper[4693]: I1125 12:32:21.587647 4693 scope.go:117] "RemoveContainer" containerID="9276d800f34d5243ede48bbfd8f32cad496ae6ac720cacf0b6ce0acc117dae10" Nov 25 12:32:21 crc kubenswrapper[4693]: I1125 12:32:21.588418 4693 scope.go:117] "RemoveContainer" containerID="0dd439932dc2065c6315289be8b718978b417112a7eb20be9309352a74dc3898" Nov 25 12:32:21 crc kubenswrapper[4693]: E1125 12:32:21.588894 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-5995bbfc5f-c8gkc_metallb-system(0d2b9e6f-fe11-47e3-af7b-cca0fff65798)\"" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" Nov 25 12:32:21 crc kubenswrapper[4693]: I1125 12:32:21.981854 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7b567956b5-gk28d" Nov 25 12:32:22 crc kubenswrapper[4693]: I1125 12:32:22.441618 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/kube-state-metrics-0" Nov 25 12:32:22 crc kubenswrapper[4693]: I1125 12:32:22.441678 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 12:32:22 crc kubenswrapper[4693]: I1125 12:32:22.442410 4693 scope.go:117] "RemoveContainer" containerID="cad48c1e5032e0c7214de93640ad44a9ae54028f19294125b41d143c22c68223" Nov 25 12:32:22 crc kubenswrapper[4693]: E1125 12:32:22.442645 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-state-metrics pod=kube-state-metrics-0_openstack(ee5b4281-3cdb-4bad-8002-8520136232a4)\"" pod="openstack/kube-state-metrics-0" podUID="ee5b4281-3cdb-4bad-8002-8520136232a4" Nov 25 12:32:23 crc kubenswrapper[4693]: I1125 12:32:23.404793 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-b58f89467-jlbhg" Nov 25 12:32:23 crc kubenswrapper[4693]: I1125 12:32:23.495800 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 25 12:32:23 crc kubenswrapper[4693]: I1125 12:32:23.856024 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-bbkk2" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.203314 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.203362 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.204063 4693 scope.go:117] "RemoveContainer" containerID="b01e14a06ec7f21819ed59ce624d425a1074b68d1e4626645e57833ac79e0413" Nov 25 12:32:24 crc kubenswrapper[4693]: E1125 12:32:24.204323 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=openstack-operator-controller-manager-7cd5954d9-rqjq9_openstack-operators(c80a0f65-6193-435f-8138-eb5a4ba71b22)\"" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" podUID="c80a0f65-6193-435f-8138-eb5a4ba71b22" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.286725 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.398955 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.818875 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.842956 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.904070 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 25 12:32:24 crc kubenswrapper[4693]: I1125 12:32:24.950818 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.167502 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.207426 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.240097 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.323266 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.437105 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" 
Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.542340 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.639580 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.829465 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.855562 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.865881 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h7bgw" Nov 25 12:32:25 crc kubenswrapper[4693]: I1125 12:32:25.949017 4693 scope.go:117] "RemoveContainer" containerID="dbd83a61272aad32773d0082cb1579c348d783dd55b8c14070663bfe58a4a673" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.005547 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.018287 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nl7sc" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.111714 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.229766 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-9jjvz" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.349895 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.376762 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.376775 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.421802 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.450318 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hk688" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.568421 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.580334 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.750097 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.807519 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 
12:32:26.825238 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.887435 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.943805 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.981099 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 25 12:32:26 crc kubenswrapper[4693]: I1125 12:32:26.984785 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.028414 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-spb4r" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.035933 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.038532 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.045636 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nt867" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.045648 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.098502 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.160240 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.162949 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.289279 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f892l" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.330997 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.396067 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4cpkn" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.496437 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.563791 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.635830 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.695172 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-cell1-default-user" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.704597 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.705495 4693 scope.go:117] "RemoveContainer" containerID="0dd439932dc2065c6315289be8b718978b417112a7eb20be9309352a74dc3898" Nov 25 12:32:27 crc kubenswrapper[4693]: E1125 12:32:27.705901 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=metallb-operator-controller-manager-5995bbfc5f-c8gkc_metallb-system(0d2b9e6f-fe11-47e3-af7b-cca0fff65798)\"" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" podUID="0d2b9e6f-fe11-47e3-af7b-cca0fff65798" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.727567 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.775642 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.801436 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.831948 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.938818 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 25 12:32:27 crc kubenswrapper[4693]: I1125 12:32:27.948509 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.015107 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.034208 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.067707 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.113823 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.114984 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.128017 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.160810 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.171349 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.202169 4693 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.209246 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.226413 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.231870 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.244581 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.274051 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.285217 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.290684 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.291166 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-j2j9t" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.317869 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.325326 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.348652 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.416414 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.422899 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.437872 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-p5jhb" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.573984 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.575052 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.623762 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lfxmx" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.658242 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.683150 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 
12:32:28.687772 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.701479 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.715594 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6qh5n" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.722729 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.723397 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.731179 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.806714 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.807769 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.817547 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.893518 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.917160 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.954501 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.984751 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 25 12:32:28 crc kubenswrapper[4693]: I1125 12:32:28.992096 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.072187 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-c4xvn" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.107430 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.137590 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.168430 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.230637 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t8gn6" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.232068 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.236036 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.242661 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.243203 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.251447 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-q2hwj" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.262042 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.263795 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.279199 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.331966 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.453734 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.475395 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mmnnt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.482193 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.533663 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.572123 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.589320 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.598471 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.606018 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.632430 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.655653 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.656249 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qcnb8" Nov 25 12:32:29 crc 
kubenswrapper[4693]: I1125 12:32:29.680472 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.711953 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.786081 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.811455 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.816799 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.817910 4693 scope.go:117] "RemoveContainer" containerID="131d8b9e16e8795b8042d2ca09411459ec391c6df8a164a8c3d613b036f0ec81" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.872163 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.877527 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f2cgs" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.877935 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.885256 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.899085 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.911193 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rq5lp" Nov 25 12:32:29 crc kubenswrapper[4693]: I1125 12:32:29.972101 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.021993 4693 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.035479 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-vcjwk" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.099291 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.100576 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-slsbp" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.105665 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.115124 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.135472 4693 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.168664 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kjtkc" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.267768 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.311960 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.360943 4693 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.373319 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.435207 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-n2jf7" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.545726 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.617610 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.651724 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.670817 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" event={"ID":"4ab70f55-282f-4509-bc36-71ef2fe4d35b","Type":"ContainerStarted","Data":"b7fd12369a07a7ee383525878ba9a8890f153e580341b7d6737fc8cd7aaea194"} Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.671218 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.692784 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.719279 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.737699 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.750943 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ghd6w" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.791977 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5mjsr" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.796960 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.823519 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.825775 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.838143 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.841905 4693 scope.go:117] "RemoveContainer" containerID="3f44b807f8c7da0ee9410ad2dfbb6c2020c6a817f2a6412ecf9abf9259dc183a" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.842238 4693 scope.go:117] "RemoveContainer" containerID="473ab5cc9f5e495f0026948091ceb90c208d92b0b297ed96075e8ea4e08d01a7" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.843120 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.843415 4693 scope.go:117] "RemoveContainer" containerID="8db3ebb13898f8b25875dbef6b6608b580d63375fb3c4f8a0b9bae3b59f6156c" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.924713 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.967600 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 25 12:32:30 crc kubenswrapper[4693]: I1125 12:32:30.979305 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.024299 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.047249 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.052188 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.126552 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.248047 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.254693 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-949sh" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.254781 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.270969 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-sh496" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.310342 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.313040 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 25 12:32:31 crc 
kubenswrapper[4693]: I1125 12:32:31.313610 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.362972 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.378503 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-62xn5" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.389849 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.398840 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.428738 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.437356 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.469958 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.470182 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.475788 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.487066 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.514585 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.519088 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.532335 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-t2wvn" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.539903 4693 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.552577 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.552641 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.553011 4693 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.553041 4693 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5c154f2a-1272-4355-9c90-4ba1ac6b7118" Nov 25 
12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.557540 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.571314 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.577061 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-km2nn" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.579583 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.57956819 podStartE2EDuration="17.57956819s" podCreationTimestamp="2025-11-25 12:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 12:32:31.571104224 +0000 UTC m=+1471.489189615" watchObservedRunningTime="2025-11-25 12:32:31.57956819 +0000 UTC m=+1471.497653571" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.579967 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jxwmh" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.639770 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.647988 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.649432 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.672487 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.681528 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" event={"ID":"a64b0f5c-e6af-4903-925a-028aec5477fd","Type":"ContainerStarted","Data":"e954662dae7cc79c17034df3b180d7922557ac91484c4953dcb00ff5a235ce53"} Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.681800 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.683671 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" event={"ID":"7ecc8c23-d9b2-4d46-a8b0-76758035b267","Type":"ContainerStarted","Data":"479037cb982724aa0e3e3e5213bb87bcfe2323bd018c1b63af330f23062a1fa8"} Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.684573 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.687232 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-qbjp2" event={"ID":"28782f20-4534-4137-b590-7a3b31c638b2","Type":"ContainerStarted","Data":"743fd654bf1920ea20c01ce11ba882f3961c7455507bf380eeef4559b4aeb2b1"} Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.698576 
4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.737722 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.745270 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-drzcq" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.781653 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.813215 4693 scope.go:117] "RemoveContainer" containerID="de1061780f67cdb7bb611e2ffb2f5767cd0c0eaf777a843e7012a6209aa6f4da" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.813482 4693 scope.go:117] "RemoveContainer" containerID="08262cc82f0084020fdb0b7bbef1a21def26efff477125af0d4aa8a3410b850b" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.813631 4693 scope.go:117] "RemoveContainer" containerID="3398f0e87aacd1d0644fb32e02b5520ecc324408aad1d8f9e4117a34cd6e9623" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.813670 4693 scope.go:117] "RemoveContainer" containerID="9591f3ac45e35952c205b0fa2e1f1b9950f2bc3dd784bb9ac266219b16b19065" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.813829 4693 scope.go:117] "RemoveContainer" containerID="06535766f329732625b6f90ef696ab32b44ec08c2e6f46ce58068be14e57d952" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.885063 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.927780 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 25 12:32:31 crc kubenswrapper[4693]: I1125 12:32:31.941343 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.012872 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.015706 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.043559 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.177248 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.213973 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.216448 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.279596 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.327195 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 
12:32:32.346071 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.408694 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.483079 4693 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.484890 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.540789 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.649602 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.655090 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.698036 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" event={"ID":"b29c9c21-026a-4701-99a7-769d382a2da2","Type":"ContainerStarted","Data":"1558a96c20ce6f77bfa0a658033710dcced5b86ab2e190ad1b76a664c8e5b80f"} Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.698318 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.700184 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" event={"ID":"2f11c884-15fc-4e2a-a533-d0eac0639f80","Type":"ContainerStarted","Data":"34a63753b50d7a226fb6139325f146fd5cc412b16f44475483173ad9db52a20e"} Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.700404 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.702307 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" event={"ID":"4dd9cd53-1f66-4636-9fab-9f0b3ff38009","Type":"ContainerStarted","Data":"0426ba1745a8db6f7226a0ad80f53f0aed1e8b34ef1a7981752558619b3ab43c"} Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.702598 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.706525 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" event={"ID":"c3a7c8cb-ac3c-43d3-b38d-0c3625c53196","Type":"ContainerStarted","Data":"f4e09d5f435d0d1e50f75134c57cc9cdb7c7f2de5a75da4ad2c0b9bd042a55dd"} Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.706739 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.710401 4693 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" event={"ID":"9cc5c4a9-0119-48b6-a795-9f482b55278b","Type":"ContainerStarted","Data":"dc7f6f78b2f0510339be85da6ad8cdf7f76aae9ce8df89a59c061f712829d805"} Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.758644 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.773150 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.787277 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.794252 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.801620 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.812962 4693 scope.go:117] "RemoveContainer" containerID="f924c555e294ce79df52550a36a751d3db1c3720b8db55d8ae80a2dced02311b" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.813067 4693 scope.go:117] "RemoveContainer" containerID="8491d915b326a12eb45781446aab101877da8f76fd20abe6575ea9e33dfcaf96" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.813289 4693 scope.go:117] "RemoveContainer" containerID="17de0afe28901c5936bb9c0db152ea67ba9b918bd046d5e4fc2e53e67af688ec" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.813434 4693 scope.go:117] "RemoveContainer" containerID="27c5891a0eb8db55cb8f850039af58dc9697b7a9362fc7622192e963ab293556" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.910574 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6pxfk" Nov 25 12:32:32 crc kubenswrapper[4693]: I1125 12:32:32.960015 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.065135 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5n4d9" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.109091 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-q867m" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.123499 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.190982 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p89nn" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.245509 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.267227 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.277803 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-cinder-public-svc" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.295435 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hbbx6" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.297475 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.304080 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.304160 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.420183 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.707659 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.721757 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mgh26" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.722508 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" event={"ID":"7cb65a4e-3294-4104-b3bf-6d1103b92c38","Type":"ContainerStarted","Data":"14c25e96027b2d769f3a3a5b4370529329ca1a5b958b79eb645423381b92d887"} Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.722717 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.725424 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" event={"ID":"22a83ecc-1f72-4474-a470-2ee4bef7eddf","Type":"ContainerStarted","Data":"ddfaf0598b7460e9f7f7f1c7840fc019a5f2847dd3620166dda668f5912877e4"} Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.726002 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.728775 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" event={"ID":"5c98082e-070e-42b1-afdc-69cea132629e","Type":"ContainerStarted","Data":"8ccb268cbba5a2992215912d4aaebff0fb63ff197f7fd65fe3260ef1361ebff1"} Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.729580 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.731815 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" event={"ID":"f6bc1c64-200f-492f-bad9-dfecd5687698","Type":"ContainerStarted","Data":"6bc1ba380fbf38df6a077865b14c5063c3139f4d728a4342d2366b9a83caea7b"} Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.763604 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 25 
12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.813158 4693 scope.go:117] "RemoveContainer" containerID="d42c4c12bfedad564a910c45b2917090e91b1354f18a877a812d855bc764c9c3" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.834598 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.906712 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.920062 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.941593 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.952089 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 25 12:32:33 crc kubenswrapper[4693]: I1125 12:32:33.982814 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gk7lv" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.072578 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.209647 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.301648 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-snzcd" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.325047 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.334021 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.359668 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.372550 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.389589 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.394142 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.418612 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.419699 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.449662 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.452548 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.468815 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pgsm6" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.519501 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.522683 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.524064 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.524117 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.606952 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.644667 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.657483 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.673484 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.708566 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.713155 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.728920 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.743563 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" event={"ID":"bfeee7c1-207f-4862-b172-f2ffab4a1500","Type":"ContainerStarted","Data":"c6bb5b735032bcc5dc75edc913986d207ca1dd6d06ae5c27f998549006efdced"} Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.743961 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.813218 4693 scope.go:117] "RemoveContainer" containerID="40493dd1f7193de376cf060f08fbbdaf773d14634bbd5fdd8e992107dce4cc9c" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.813396 4693 scope.go:117] "RemoveContainer" containerID="90154e0c65b206bd7f11b146252026bf49c29e8a600c5c7dc1ed554dd1cc4eca" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.813581 4693 scope.go:117] "RemoveContainer" containerID="3e7456d5d3738801a4ed8914bfe561a50c956903bcf9bc7669432ec6f3e8d30c" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.814068 4693 scope.go:117] "RemoveContainer" 
containerID="132c7b926ee6482acad15da9715a9fed6619442a7e93dc27da6375a9d1f95082" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.814281 4693 scope.go:117] "RemoveContainer" containerID="56023f15cd0a1e16d627b8e19cf69c999dbe8a18ef01ce954fe3403a4319e909" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.814607 4693 scope.go:117] "RemoveContainer" containerID="b01e14a06ec7f21819ed59ce624d425a1074b68d1e4626645e57833ac79e0413" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.821048 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 25 12:32:34 crc kubenswrapper[4693]: I1125 12:32:34.962155 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.038012 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.038574 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.043109 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.102948 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lbvsr" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.142453 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.152432 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.159883 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.196023 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.248334 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.268638 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.318458 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.351522 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.431538 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.466424 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.468419 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.485508 4693 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.495086 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.542938 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.550033 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.595315 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.617442 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.641223 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.666407 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.754894 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" event={"ID":"1c7db975-17d7-48dd-8e5a-0549749ab866","Type":"ContainerStarted","Data":"1b7d6f45e6ad66362698ef844be66529beb051bd37d9c78941f823cfdb34f5f3"} Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.755527 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.757023 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" event={"ID":"fe2a0074-66dc-4730-9321-772ee8fd8e28","Type":"ContainerStarted","Data":"1be0d0d07f2da6991132ef8283be9b8edc8da93c48c45d21cfb008a52191b261"} Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.757268 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.759297 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" event={"ID":"c80a0f65-6193-435f-8138-eb5a4ba71b22","Type":"ContainerStarted","Data":"0abd73a9bb87096393d3cb7c186d40d415287aa57b11a1948adb3e8c534e86d7"} Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.759632 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.761252 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" event={"ID":"b9227546-dcce-4b09-9311-19f844deb318","Type":"ContainerStarted","Data":"4a3400ad6f53210f4b846fdb44e21fc5770e58d1011e4c81296aceea3680331e"} Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.761491 4693 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.763317 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" event={"ID":"0f35f544-581e-4cb2-900f-71213e27477d","Type":"ContainerStarted","Data":"99ec8984236bf95b198ff7c1a9afe2083bb844dd3cb7f1f6083c9454c47c8e7e"} Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.764209 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.766342 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" event={"ID":"3c29e8b9-57cf-4967-b5e2-a6af42c16099","Type":"ContainerStarted","Data":"148676c4e3b8f857f647f713c5a5390bbda7ba5be9b1d9644a8364b1ca048ddd"} Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.766564 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.769095 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9jrzc" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.775670 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6lvzb" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.793397 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.795455 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.808018 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.812644 4693 scope.go:117] "RemoveContainer" containerID="389dec33bbdd099de036486f78b012d7fd9380277e97ee8d6f6bdfeb334c84f3" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.858544 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.879332 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.900470 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 25 12:32:35 crc kubenswrapper[4693]: I1125 12:32:35.966061 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.021720 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.048526 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.056443 4693 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.064787 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.070439 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.136589 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.140816 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j4vmt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.180223 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.185275 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.213086 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dxsx4" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.238308 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.278956 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.299204 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.362725 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.393042 4693 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.401867 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rzw75" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.414124 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.436616 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.469772 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zlsqm" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.498514 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.628357 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.634297 4693 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.652694 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lbns7" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.711701 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.716870 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.758925 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.782526 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.788049 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" event={"ID":"105791fd-407d-44a3-8fc8-af90e82b0f63","Type":"ContainerStarted","Data":"4d26cb1c501274827f166a53c8c1dff0facc810ac60f01e4a2d55c5dc3fee659"} Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.790806 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.798760 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.805723 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.846674 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.889872 4693 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.890119 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448" gracePeriod=5 Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.895765 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.945787 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.975927 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 25 12:32:36 crc kubenswrapper[4693]: I1125 12:32:36.990046 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-c9m8g" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.012472 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.034706 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.050979 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.051590 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.058616 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.253230 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.380024 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.398501 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.427191 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.490071 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.554609 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.638318 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.648192 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.738546 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.814270 4693 scope.go:117] "RemoveContainer" containerID="cad48c1e5032e0c7214de93640ad44a9ae54028f19294125b41d143c22c68223" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.859720 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.874834 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.876430 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.909260 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.975012 4693 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.977464 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 25 12:32:37 crc kubenswrapper[4693]: I1125 12:32:37.990404 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.000271 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.118614 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.121088 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.268419 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.344846 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.355671 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.371234 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.414765 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-828mx" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.425440 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.436197 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.454086 4693 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.567500 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.622044 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.624636 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.636167 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.636777 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.637397 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 25 12:32:38 crc kubenswrapper[4693]: 
I1125 12:32:38.677748 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.758032 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.772874 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.783075 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.807984 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ee5b4281-3cdb-4bad-8002-8520136232a4","Type":"ContainerStarted","Data":"890a44be175ce75432ec01280add2830f9aadfc82170b5558848e46a08c0fa12"} Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.808235 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.859637 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.881507 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.903757 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 25 12:32:38 crc kubenswrapper[4693]: I1125 12:32:38.948928 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.018609 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.029104 4693 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-djldv" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.031028 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.120596 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.127490 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.134724 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.138538 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-86dc4d89c8-6wxtj" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.153696 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.156249 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-79856dc55c-4lt8v" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.195981 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.196048 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.198124 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-7d695c9b56-6dtx6" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.257859 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.259429 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-774b86978c-nzz29" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.285965 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.298989 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.307038 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c9694994-fwwsj" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.332873 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.461444 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-d5cc86f4b-r86ct" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.464406 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.517496 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-68b95954c9-866fd" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.586185 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-748dc6576f-zcpsz" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.601937 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.618995 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58bb8d67cc-5ghnq" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.620092 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.631394 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.648217 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/mariadb-operator-controller-manager-cb6c4fdb7-s9shw" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.652602 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.675922 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.680906 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.734147 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.769449 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-flxdz" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.837471 4693 generic.go:334] "Generic (PLEG): container finished" podID="886fc2dd-e1c6-4822-b516-1540c9e77f39" containerID="bb4263ecb7b497deddc68a0dc14305da446c89c7f0dcbc4a5a09a6d6bb5e715e" exitCode=1 Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.838060 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" event={"ID":"886fc2dd-e1c6-4822-b516-1540c9e77f39","Type":"ContainerDied","Data":"bb4263ecb7b497deddc68a0dc14305da446c89c7f0dcbc4a5a09a6d6bb5e715e"} Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.838944 4693 scope.go:117] "RemoveContainer" containerID="bb4263ecb7b497deddc68a0dc14305da446c89c7f0dcbc4a5a09a6d6bb5e715e" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.904058 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.910285 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5db546f9d9-f4trp" Nov 25 12:32:39 crc kubenswrapper[4693]: I1125 12:32:39.936278 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6fdc4fcf86-bnf27" Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.127426 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8gxsw" Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.131723 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.192919 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-567f98c9d-cwrvs" Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.507951 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.735962 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.798240 4693 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.849452 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-krwsc" event={"ID":"886fc2dd-e1c6-4822-b516-1540c9e77f39","Type":"ContainerStarted","Data":"dbc32968380a8932dcb54a009fff3369718c682931d6bc1ae7cb81ccbf9967fc"} Nov 25 12:32:40 crc kubenswrapper[4693]: I1125 12:32:40.931244 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-bn5rr" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.533477 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.533547 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.591657 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.591748 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.591871 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.591909 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.591939 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.591992 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.592058 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.592160 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.592296 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.592841 4693 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.592862 4693 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.592881 4693 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.592892 4693 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.600901 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.695093 4693 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.813605 4693 scope.go:117] "RemoveContainer" containerID="0dd439932dc2065c6315289be8b718978b417112a7eb20be9309352a74dc3898" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.830133 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.868052 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.868116 4693 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448" exitCode=137 Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.868165 4693 scope.go:117] "RemoveContainer" containerID="54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.868183 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.905254 4693 scope.go:117] "RemoveContainer" containerID="54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448" Nov 25 12:32:42 crc kubenswrapper[4693]: E1125 12:32:42.905794 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448\": container with ID starting with 54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448 not found: ID does not exist" containerID="54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448" Nov 25 12:32:42 crc kubenswrapper[4693]: I1125 12:32:42.905838 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448"} err="failed to get container status \"54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448\": rpc error: code = NotFound desc = could not find container \"54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448\": container with ID starting with 54d9593aaff2d252f0e0e14da995e7c4ab8a2275156e2adb42bb9be847ad1448 not found: ID does not exist" Nov 25 12:32:43 crc kubenswrapper[4693]: I1125 12:32:43.880900 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" event={"ID":"0d2b9e6f-fe11-47e3-af7b-cca0fff65798","Type":"ContainerStarted","Data":"bcfe8201800c4f5e16cf2abc54f495f734ee6512de4ea1856ef829367ac6c277"} Nov 25 12:32:43 crc kubenswrapper[4693]: I1125 12:32:43.881682 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc" Nov 25 12:32:44 crc kubenswrapper[4693]: I1125 12:32:44.209447 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-manager-7cd5954d9-rqjq9" Nov 25 12:32:49 crc kubenswrapper[4693]: I1125 12:32:49.573064 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bfcdc958c-szrv4" Nov 25 12:32:49 crc kubenswrapper[4693]: I1125 12:32:49.700048 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7c57c8bbc4-csrpt" Nov 25 12:32:49 crc kubenswrapper[4693]: I1125 12:32:49.845284 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-66cf5c67ff-k2njb" Nov 25 12:32:49 crc kubenswrapper[4693]: I1125 12:32:49.845776 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-fd75fd47d-g972v" Nov 25 12:32:50 crc kubenswrapper[4693]: I1125 12:32:50.607558 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-864885998-tc9jb" Nov 25 12:32:50 crc kubenswrapper[4693]: I1125 12:32:50.871495 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 25 12:32:51 crc kubenswrapper[4693]: I1125 12:32:51.403066 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 25 12:32:51 crc kubenswrapper[4693]: I1125 12:32:51.878021 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 25 12:32:52 crc kubenswrapper[4693]: I1125 12:32:52.317868 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 25 12:32:52 crc kubenswrapper[4693]: I1125 12:32:52.450210 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 25 12:32:53 crc kubenswrapper[4693]: I1125 12:32:53.061831 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 25 12:32:54 crc kubenswrapper[4693]: I1125 12:32:54.489510 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 25 12:32:57 crc kubenswrapper[4693]: I1125 12:32:57.248263 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 25 12:32:57 crc kubenswrapper[4693]: I1125 12:32:57.374669 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.084674 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.617506 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.698149 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.978325 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qsr66"] Nov 25 12:33:00 crc kubenswrapper[4693]: E1125 12:33:00.978846 4693 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" containerName="installer" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.978867 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" containerName="installer" Nov 25 12:33:00 crc kubenswrapper[4693]: E1125 12:33:00.978891 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.978898 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 12:33:00 crc kubenswrapper[4693]: E1125 12:33:00.978917 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.978925 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.979109 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.979124 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa32523e-ff2b-4ce4-90a6-533c59472054" containerName="installer" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.979135 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8577c4-f507-4e40-b284-66d57b0aee3d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.980591 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.988777 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-55s94"] Nov 25 12:33:00 crc kubenswrapper[4693]: I1125 12:33:00.990731 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.001059 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-55s94"] Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.031296 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qsr66"] Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.031296 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-catalog-content\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.031728 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-utilities\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.031838 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-catalog-content\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.032005 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptmwz\" (UniqueName: \"kubernetes.io/projected/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-kube-api-access-ptmwz\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.032224 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-utilities\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.032356 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcf9\" (UniqueName: \"kubernetes.io/projected/86631848-2a82-4a9e-b733-a44cf268a2e1-kube-api-access-7hcf9\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.133203 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-catalog-content\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.133287 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-utilities\") pod 
\"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.133307 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-catalog-content\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.133345 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptmwz\" (UniqueName: \"kubernetes.io/projected/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-kube-api-access-ptmwz\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.133445 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-utilities\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.133466 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hcf9\" (UniqueName: \"kubernetes.io/projected/86631848-2a82-4a9e-b733-a44cf268a2e1-kube-api-access-7hcf9\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.134015 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-catalog-content\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.134309 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-catalog-content\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.134452 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-utilities\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.134553 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-utilities\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.142611 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vx9wx" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.158657 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7hcf9\" (UniqueName: \"kubernetes.io/projected/86631848-2a82-4a9e-b733-a44cf268a2e1-kube-api-access-7hcf9\") pod \"redhat-operators-qsr66\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.158691 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptmwz\" (UniqueName: \"kubernetes.io/projected/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-kube-api-access-ptmwz\") pod \"community-operators-55s94\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") " pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.314420 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.331446 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.804661 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-55s94"] Nov 25 12:33:01 crc kubenswrapper[4693]: I1125 12:33:01.875838 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qsr66"] Nov 25 12:33:01 crc kubenswrapper[4693]: W1125 12:33:01.879035 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86631848_2a82_4a9e_b733_a44cf268a2e1.slice/crio-55125468c2aad2388f2c93a53200eb5edf506f1c4646cf0e0a3e91de0bbc1d7a WatchSource:0}: Error finding container 55125468c2aad2388f2c93a53200eb5edf506f1c4646cf0e0a3e91de0bbc1d7a: Status 404 returned error can't find the container with id 55125468c2aad2388f2c93a53200eb5edf506f1c4646cf0e0a3e91de0bbc1d7a Nov 25 12:33:02 crc kubenswrapper[4693]: I1125 12:33:02.052774 4693 generic.go:334] "Generic (PLEG): container finished" podID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerID="b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679" exitCode=0 Nov 25 12:33:02 crc kubenswrapper[4693]: I1125 12:33:02.052846 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsr66" event={"ID":"86631848-2a82-4a9e-b733-a44cf268a2e1","Type":"ContainerDied","Data":"b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679"} Nov 25 12:33:02 crc kubenswrapper[4693]: I1125 12:33:02.052870 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsr66" event={"ID":"86631848-2a82-4a9e-b733-a44cf268a2e1","Type":"ContainerStarted","Data":"55125468c2aad2388f2c93a53200eb5edf506f1c4646cf0e0a3e91de0bbc1d7a"} Nov 25 12:33:02 crc kubenswrapper[4693]: I1125 12:33:02.055083 4693 generic.go:334] "Generic (PLEG): container finished" podID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerID="8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1" exitCode=0 Nov 25 12:33:02 crc kubenswrapper[4693]: I1125 12:33:02.055115 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55s94" event={"ID":"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef","Type":"ContainerDied","Data":"8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1"} Nov 25 12:33:02 crc kubenswrapper[4693]: I1125 12:33:02.055134 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-55s94" event={"ID":"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef","Type":"ContainerStarted","Data":"39d6f942efd1fe19b839d3f04affc0160991c996bc38a23299499bef11cae271"} Nov 25 12:33:02 crc kubenswrapper[4693]: I1125 12:33:02.316886 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 25 12:33:03 crc kubenswrapper[4693]: I1125 12:33:03.069093 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsr66" event={"ID":"86631848-2a82-4a9e-b733-a44cf268a2e1","Type":"ContainerStarted","Data":"3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a"} Nov 25 12:33:03 crc kubenswrapper[4693]: I1125 12:33:03.072784 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55s94" event={"ID":"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef","Type":"ContainerStarted","Data":"2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64"} Nov 25 12:33:03 crc kubenswrapper[4693]: I1125 12:33:03.484176 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 25 12:33:04 crc kubenswrapper[4693]: I1125 12:33:04.084202 4693 generic.go:334] "Generic (PLEG): container finished" podID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerID="2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64" exitCode=0 Nov 25 12:33:04 crc kubenswrapper[4693]: I1125 12:33:04.084286 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55s94" event={"ID":"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef","Type":"ContainerDied","Data":"2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64"} Nov 25 12:33:04 crc kubenswrapper[4693]: I1125 12:33:04.088758 4693 generic.go:334] "Generic (PLEG): container finished" podID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerID="3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a" exitCode=0 Nov 25 12:33:04 crc kubenswrapper[4693]: I1125 12:33:04.088787 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsr66" event={"ID":"86631848-2a82-4a9e-b733-a44cf268a2e1","Type":"ContainerDied","Data":"3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a"} Nov 25 12:33:05 crc kubenswrapper[4693]: I1125 12:33:05.101410 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55s94" event={"ID":"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef","Type":"ContainerStarted","Data":"28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375"} Nov 25 12:33:05 crc kubenswrapper[4693]: I1125 12:33:05.105962 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsr66" event={"ID":"86631848-2a82-4a9e-b733-a44cf268a2e1","Type":"ContainerStarted","Data":"b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d"} Nov 25 12:33:05 crc kubenswrapper[4693]: I1125 12:33:05.134888 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-55s94" podStartSLOduration=2.6876324499999997 podStartE2EDuration="5.134858112s" podCreationTimestamp="2025-11-25 12:33:00 +0000 UTC" firstStartedPulling="2025-11-25 12:33:02.056931659 +0000 UTC m=+1501.975017040" lastFinishedPulling="2025-11-25 12:33:04.504157321 +0000 UTC m=+1504.422242702" observedRunningTime="2025-11-25 12:33:05.122557975 +0000 UTC 
m=+1505.040643356" watchObservedRunningTime="2025-11-25 12:33:05.134858112 +0000 UTC m=+1505.052943493" Nov 25 12:33:06 crc kubenswrapper[4693]: I1125 12:33:06.520093 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.367922 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.480674 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qsr66" podStartSLOduration=7.910900569 podStartE2EDuration="10.480646076s" podCreationTimestamp="2025-11-25 12:32:58 +0000 UTC" firstStartedPulling="2025-11-25 12:33:02.054481344 +0000 UTC m=+1501.972566725" lastFinishedPulling="2025-11-25 12:33:04.624226851 +0000 UTC m=+1504.542312232" observedRunningTime="2025-11-25 12:33:05.151625281 +0000 UTC m=+1505.069710662" watchObservedRunningTime="2025-11-25 12:33:08.480646076 +0000 UTC m=+1508.398731457" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.496169 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc"] Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.497676 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.500498 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.500699 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.500805 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.504134 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.512725 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc"] Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.676046 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.676117 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv9cn\" (UniqueName: \"kubernetes.io/projected/70ae1b8a-3af0-4d98-a633-6933a83b2b71-kube-api-access-pv9cn\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.676190 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.777864 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.777922 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv9cn\" (UniqueName: \"kubernetes.io/projected/70ae1b8a-3af0-4d98-a633-6933a83b2b71-kube-api-access-pv9cn\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.777997 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.785204 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.785787 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.800357 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv9cn\" (UniqueName: \"kubernetes.io/projected/70ae1b8a-3af0-4d98-a633-6933a83b2b71-kube-api-access-pv9cn\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65zzc\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:08 crc kubenswrapper[4693]: I1125 12:33:08.828997 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" Nov 25 12:33:09 crc kubenswrapper[4693]: I1125 12:33:09.193201 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 25 12:33:09 crc kubenswrapper[4693]: W1125 12:33:09.652848 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70ae1b8a_3af0_4d98_a633_6933a83b2b71.slice/crio-5f47e9a83005094a376481867b961e2c4ffd990fc75faa0067b5c4b2378764d3 WatchSource:0}: Error finding container 5f47e9a83005094a376481867b961e2c4ffd990fc75faa0067b5c4b2378764d3: Status 404 returned error can't find the container with id 5f47e9a83005094a376481867b961e2c4ffd990fc75faa0067b5c4b2378764d3 Nov 25 12:33:09 crc kubenswrapper[4693]: I1125 12:33:09.656814 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc"] Nov 25 12:33:10 crc kubenswrapper[4693]: I1125 12:33:10.183497 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" event={"ID":"70ae1b8a-3af0-4d98-a633-6933a83b2b71","Type":"ContainerStarted","Data":"5f47e9a83005094a376481867b961e2c4ffd990fc75faa0067b5c4b2378764d3"} Nov 25 12:33:10 crc kubenswrapper[4693]: I1125 12:33:10.304008 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 25 12:33:10 crc kubenswrapper[4693]: I1125 12:33:10.946258 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.199600 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" event={"ID":"70ae1b8a-3af0-4d98-a633-6933a83b2b71","Type":"ContainerStarted","Data":"22f6bd209d3c07c63036a7f3f640b8489dfecc5593dfc41d7a56ab38c9d57312"} Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.255169 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" podStartSLOduration=2.713451701 podStartE2EDuration="3.255144583s" podCreationTimestamp="2025-11-25 12:33:08 +0000 UTC" firstStartedPulling="2025-11-25 12:33:09.655543345 +0000 UTC m=+1509.573628726" lastFinishedPulling="2025-11-25 12:33:10.197236227 +0000 UTC m=+1510.115321608" observedRunningTime="2025-11-25 12:33:11.248397008 +0000 UTC m=+1511.166482409" watchObservedRunningTime="2025-11-25 12:33:11.255144583 +0000 UTC m=+1511.173229964" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.315278 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.315343 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.332701 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.332769 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.440763 4693 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.831132 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tf8rj"] Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.833668 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.854127 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf8rj"] Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.971437 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-utilities\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.971535 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bffjq\" (UniqueName: \"kubernetes.io/projected/46fe51f7-8369-4689-a7fe-cf9a72784542-kube-api-access-bffjq\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:11 crc kubenswrapper[4693]: I1125 12:33:11.971700 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-catalog-content\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.043108 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cb88v"] Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.089005 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-utilities\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.089178 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bffjq\" (UniqueName: \"kubernetes.io/projected/46fe51f7-8369-4689-a7fe-cf9a72784542-kube-api-access-bffjq\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.089480 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-catalog-content\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.090239 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-catalog-content\") pod \"community-operators-tf8rj\" (UID: 
\"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.090766 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-utilities\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.114407 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.127980 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cb88v"] Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.160577 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bffjq\" (UniqueName: \"kubernetes.io/projected/46fe51f7-8369-4689-a7fe-cf9a72784542-kube-api-access-bffjq\") pod \"community-operators-tf8rj\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") " pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.169801 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf8rj" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.191640 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-utilities\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.191709 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-catalog-content\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.191871 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ncr6\" (UniqueName: \"kubernetes.io/projected/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-kube-api-access-5ncr6\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.296045 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ncr6\" (UniqueName: \"kubernetes.io/projected/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-kube-api-access-5ncr6\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.296733 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-utilities\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.296767 4693 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-catalog-content\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.297575 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-catalog-content\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.297723 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-utilities\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.298144 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-55s94" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.319271 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ncr6\" (UniqueName: \"kubernetes.io/projected/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-kube-api-access-5ncr6\") pod \"certified-operators-cb88v\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") " pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.458146 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qsr66" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="registry-server" probeResult="failure" output=< Nov 25 12:33:12 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 12:33:12 crc kubenswrapper[4693]: > Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.613490 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cb88v" Nov 25 12:33:12 crc kubenswrapper[4693]: W1125 12:33:12.798901 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46fe51f7_8369_4689_a7fe_cf9a72784542.slice/crio-1b0dddd62f04c966b280ffad281d0cb2d2cc439064b5b54c2cca19da17e0373b WatchSource:0}: Error finding container 1b0dddd62f04c966b280ffad281d0cb2d2cc439064b5b54c2cca19da17e0373b: Status 404 returned error can't find the container with id 1b0dddd62f04c966b280ffad281d0cb2d2cc439064b5b54c2cca19da17e0373b Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.800731 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf8rj"] Nov 25 12:33:12 crc kubenswrapper[4693]: I1125 12:33:12.952607 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cb88v"] Nov 25 12:33:12 crc kubenswrapper[4693]: W1125 12:33:12.962012 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c234c3_6c63_4cb0_b7a6_a8cbba9eacd4.slice/crio-b1d59bfc9851e2b2d6ee16c04e0aa694ac020e9561f1a8ec13ebe5b632d20b82 WatchSource:0}: Error finding container b1d59bfc9851e2b2d6ee16c04e0aa694ac020e9561f1a8ec13ebe5b632d20b82: Status 404 returned error can't find the container with id b1d59bfc9851e2b2d6ee16c04e0aa694ac020e9561f1a8ec13ebe5b632d20b82 Nov 25 12:33:13 crc kubenswrapper[4693]: I1125 12:33:13.231279 4693 generic.go:334] "Generic (PLEG): container finished" podID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerID="6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c" exitCode=0 Nov 25 12:33:13 crc kubenswrapper[4693]: I1125 12:33:13.231335 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb88v" event={"ID":"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4","Type":"ContainerDied","Data":"6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c"} Nov 25 12:33:13 crc kubenswrapper[4693]: I1125 12:33:13.231405 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb88v" event={"ID":"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4","Type":"ContainerStarted","Data":"b1d59bfc9851e2b2d6ee16c04e0aa694ac020e9561f1a8ec13ebe5b632d20b82"} Nov 25 12:33:13 crc kubenswrapper[4693]: I1125 12:33:13.236005 4693 generic.go:334] "Generic (PLEG): container finished" podID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerID="367443c0b75014b52b6892ee7fa0ed013ac8e5b56eb2c4a05f26e135925ccf9a" exitCode=0 Nov 25 12:33:13 crc kubenswrapper[4693]: I1125 12:33:13.236185 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf8rj" event={"ID":"46fe51f7-8369-4689-a7fe-cf9a72784542","Type":"ContainerDied","Data":"367443c0b75014b52b6892ee7fa0ed013ac8e5b56eb2c4a05f26e135925ccf9a"} Nov 25 12:33:13 crc kubenswrapper[4693]: I1125 12:33:13.236224 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf8rj" event={"ID":"46fe51f7-8369-4689-a7fe-cf9a72784542","Type":"ContainerStarted","Data":"1b0dddd62f04c966b280ffad281d0cb2d2cc439064b5b54c2cca19da17e0373b"} Nov 25 12:33:13 crc kubenswrapper[4693]: I1125 12:33:13.519974 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 25 12:33:14 crc kubenswrapper[4693]: I1125 
12:33:14.254341 4693 generic.go:334] "Generic (PLEG): container finished" podID="70ae1b8a-3af0-4d98-a633-6933a83b2b71" containerID="22f6bd209d3c07c63036a7f3f640b8489dfecc5593dfc41d7a56ab38c9d57312" exitCode=0 Nov 25 12:33:14 crc kubenswrapper[4693]: I1125 12:33:14.254411 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" event={"ID":"70ae1b8a-3af0-4d98-a633-6933a83b2b71","Type":"ContainerDied","Data":"22f6bd209d3c07c63036a7f3f640b8489dfecc5593dfc41d7a56ab38c9d57312"} Nov 25 12:33:14 crc kubenswrapper[4693]: I1125 12:33:14.258458 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf8rj" event={"ID":"46fe51f7-8369-4689-a7fe-cf9a72784542","Type":"ContainerStarted","Data":"889a7afbfb699e1c36e29949d963b41bd3b8de1d1775b9ef64838e4027cf5ddf"} Nov 25 12:33:14 crc kubenswrapper[4693]: I1125 12:33:14.480100 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 25 12:33:14 crc kubenswrapper[4693]: I1125 12:33:14.972531 4693 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.272769 4693 generic.go:334] "Generic (PLEG): container finished" podID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerID="889a7afbfb699e1c36e29949d963b41bd3b8de1d1775b9ef64838e4027cf5ddf" exitCode=0 Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.272825 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf8rj" event={"ID":"46fe51f7-8369-4689-a7fe-cf9a72784542","Type":"ContainerDied","Data":"889a7afbfb699e1c36e29949d963b41bd3b8de1d1775b9ef64838e4027cf5ddf"} Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.277084 4693 generic.go:334] "Generic (PLEG): container finished" podID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerID="6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f" exitCode=0 Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.278055 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb88v" event={"ID":"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4","Type":"ContainerDied","Data":"6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f"} Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.440493 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqrn"] Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.442758 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.457870 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqrn"] Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.572386 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-catalog-content\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.572770 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htk8z\" (UniqueName: \"kubernetes.io/projected/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-kube-api-access-htk8z\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.572973 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-utilities\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.675249 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htk8z\" (UniqueName: \"kubernetes.io/projected/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-kube-api-access-htk8z\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.675326 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-utilities\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.675408 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-catalog-content\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.675908 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-catalog-content\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.676153 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-utilities\") pod \"redhat-marketplace-dfqrn\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") " pod="openshift-marketplace/redhat-marketplace-dfqrn" Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.699612 4693 operation_generator.go:637] "MountVolume.SetUp 
Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.779836 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqrn"
Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.822932 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc"
Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.980475 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv9cn\" (UniqueName: \"kubernetes.io/projected/70ae1b8a-3af0-4d98-a633-6933a83b2b71-kube-api-access-pv9cn\") pod \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") "
Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.980582 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-inventory\") pod \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") "
Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.980609 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-ssh-key\") pod \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\" (UID: \"70ae1b8a-3af0-4d98-a633-6933a83b2b71\") "
Nov 25 12:33:15 crc kubenswrapper[4693]: I1125 12:33:15.988220 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ae1b8a-3af0-4d98-a633-6933a83b2b71-kube-api-access-pv9cn" (OuterVolumeSpecName: "kube-api-access-pv9cn") pod "70ae1b8a-3af0-4d98-a633-6933a83b2b71" (UID: "70ae1b8a-3af0-4d98-a633-6933a83b2b71"). InnerVolumeSpecName "kube-api-access-pv9cn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.046994 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-inventory" (OuterVolumeSpecName: "inventory") pod "70ae1b8a-3af0-4d98-a633-6933a83b2b71" (UID: "70ae1b8a-3af0-4d98-a633-6933a83b2b71"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.060761 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "70ae1b8a-3af0-4d98-a633-6933a83b2b71" (UID: "70ae1b8a-3af0-4d98-a633-6933a83b2b71"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.085090 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.085133 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/70ae1b8a-3af0-4d98-a633-6933a83b2b71-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.085145 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv9cn\" (UniqueName: \"kubernetes.io/projected/70ae1b8a-3af0-4d98-a633-6933a83b2b71-kube-api-access-pv9cn\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.256965 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqrn"]
Nov 25 12:33:16 crc kubenswrapper[4693]: W1125 12:33:16.260688 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ebd062b_7782_43d3_ab46_dc4ee8c93b12.slice/crio-0e971296924d5373b6d21e51d7c6dfafbc2117b124617e6a3ca5895330b2a1d0 WatchSource:0}: Error finding container 0e971296924d5373b6d21e51d7c6dfafbc2117b124617e6a3ca5895330b2a1d0: Status 404 returned error can't find the container with id 0e971296924d5373b6d21e51d7c6dfafbc2117b124617e6a3ca5895330b2a1d0
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.317304 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc" event={"ID":"70ae1b8a-3af0-4d98-a633-6933a83b2b71","Type":"ContainerDied","Data":"5f47e9a83005094a376481867b961e2c4ffd990fc75faa0067b5c4b2378764d3"}
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.317448 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f47e9a83005094a376481867b961e2c4ffd990fc75faa0067b5c4b2378764d3"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.317605 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65zzc"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.337518 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqrn" event={"ID":"2ebd062b-7782-43d3-ab46-dc4ee8c93b12","Type":"ContainerStarted","Data":"0e971296924d5373b6d21e51d7c6dfafbc2117b124617e6a3ca5895330b2a1d0"}
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.490442 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"]
Nov 25 12:33:16 crc kubenswrapper[4693]: E1125 12:33:16.491070 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ae1b8a-3af0-4d98-a633-6933a83b2b71" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.491098 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ae1b8a-3af0-4d98-a633-6933a83b2b71" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.491417 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ae1b8a-3af0-4d98-a633-6933a83b2b71" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.492243 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.504796 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.505103 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.505268 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.505498 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.513533 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"]
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.596600 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.596848 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l98wb\" (UniqueName: \"kubernetes.io/projected/0c125840-c37c-445e-95d9-37c74703ea85-kube-api-access-l98wb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.597243 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.597491 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.699395 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.699532 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.699559 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.699599 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l98wb\" (UniqueName: \"kubernetes.io/projected/0c125840-c37c-445e-95d9-37c74703ea85-kube-api-access-l98wb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.708569 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.708721 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.711362 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.730072 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l98wb\" (UniqueName: \"kubernetes.io/projected/0c125840-c37c-445e-95d9-37c74703ea85-kube-api-access-l98wb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-46z97\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:16 crc kubenswrapper[4693]: I1125 12:33:16.867084 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.353078 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf8rj" event={"ID":"46fe51f7-8369-4689-a7fe-cf9a72784542","Type":"ContainerStarted","Data":"4f19012168fe8923a70a3e09db29fd40db13bb69b59282161f1ade9afb029eef"}
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.355982 4693 generic.go:334] "Generic (PLEG): container finished" podID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerID="94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c" exitCode=0
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.356077 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqrn" event={"ID":"2ebd062b-7782-43d3-ab46-dc4ee8c93b12","Type":"ContainerDied","Data":"94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c"}
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.364636 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb88v" event={"ID":"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4","Type":"ContainerStarted","Data":"673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7"}
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.380103 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tf8rj" podStartSLOduration=3.824272267 podStartE2EDuration="6.380087399s" podCreationTimestamp="2025-11-25 12:33:11 +0000 UTC" firstStartedPulling="2025-11-25 12:33:13.237857313 +0000 UTC m=+1513.155942694" lastFinishedPulling="2025-11-25 12:33:15.793672445 +0000 UTC m=+1515.711757826" observedRunningTime="2025-11-25 12:33:17.373460927 +0000 UTC m=+1517.291546308" watchObservedRunningTime="2025-11-25 12:33:17.380087399 +0000 UTC m=+1517.298172780"
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.426196 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cb88v" podStartSLOduration=2.785656107 podStartE2EDuration="5.426173717s" podCreationTimestamp="2025-11-25 12:33:12 +0000 UTC" firstStartedPulling="2025-11-25 12:33:13.233250069 +0000 UTC m=+1513.151335450" lastFinishedPulling="2025-11-25 12:33:15.873767689 +0000 UTC m=+1515.791853060" observedRunningTime="2025-11-25 12:33:17.411224492 +0000 UTC m=+1517.329309893" watchObservedRunningTime="2025-11-25 12:33:17.426173717 +0000 UTC m=+1517.344259108"
Nov 25 12:33:17 crc kubenswrapper[4693]: W1125 12:33:17.584938 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c125840_c37c_445e_95d9_37c74703ea85.slice/crio-3b9ea55446bd5004d75d859f5967ef760ca07c7fdb2e73e1bcc222f0f1aed413 WatchSource:0}: Error finding container 3b9ea55446bd5004d75d859f5967ef760ca07c7fdb2e73e1bcc222f0f1aed413: Status 404 returned error can't find the container with id 3b9ea55446bd5004d75d859f5967ef760ca07c7fdb2e73e1bcc222f0f1aed413
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.588457 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97"]
Nov 25 12:33:17 crc kubenswrapper[4693]: I1125 12:33:17.711349 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5995bbfc5f-c8gkc"
Nov 25 12:33:18 crc kubenswrapper[4693]: I1125 12:33:18.390037 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97" event={"ID":"0c125840-c37c-445e-95d9-37c74703ea85","Type":"ContainerStarted","Data":"3b9ea55446bd5004d75d859f5967ef760ca07c7fdb2e73e1bcc222f0f1aed413"}
Nov 25 12:33:19 crc kubenswrapper[4693]: I1125 12:33:19.425230 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97" event={"ID":"0c125840-c37c-445e-95d9-37c74703ea85","Type":"ContainerStarted","Data":"6455ecb7623988988ae65d8c44655e5a2150e26855bffaf2d6b010a29399019a"}
Nov 25 12:33:19 crc kubenswrapper[4693]: I1125 12:33:19.431080 4693 generic.go:334] "Generic (PLEG): container finished" podID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerID="6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1" exitCode=0
Nov 25 12:33:19 crc kubenswrapper[4693]: I1125 12:33:19.431297 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqrn" event={"ID":"2ebd062b-7782-43d3-ab46-dc4ee8c93b12","Type":"ContainerDied","Data":"6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1"}
Nov 25 12:33:19 crc kubenswrapper[4693]: I1125 12:33:19.448339 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97" podStartSLOduration=2.676411985 podStartE2EDuration="3.448309525s" podCreationTimestamp="2025-11-25 12:33:16 +0000 UTC" firstStartedPulling="2025-11-25 12:33:17.588091228 +0000 UTC m=+1517.506176609" lastFinishedPulling="2025-11-25 12:33:18.359988778 +0000 UTC m=+1518.278074149" observedRunningTime="2025-11-25 12:33:19.446270396 +0000 UTC m=+1519.364355777" watchObservedRunningTime="2025-11-25 12:33:19.448309525 +0000 UTC m=+1519.366394916"
Nov 25 12:33:20 crc kubenswrapper[4693]: I1125 12:33:20.443970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqrn" event={"ID":"2ebd062b-7782-43d3-ab46-dc4ee8c93b12","Type":"ContainerStarted","Data":"e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33"}
Nov 25 12:33:20 crc kubenswrapper[4693]: I1125 12:33:20.470477 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dfqrn" podStartSLOduration=2.796209815 podStartE2EDuration="5.470460126s" podCreationTimestamp="2025-11-25 12:33:15 +0000 UTC" firstStartedPulling="2025-11-25 12:33:17.359311247 +0000 UTC m=+1517.277396628" lastFinishedPulling="2025-11-25 12:33:20.033561558 +0000 UTC m=+1519.951646939" observedRunningTime="2025-11-25 12:33:20.468491481 +0000 UTC m=+1520.386576862" watchObservedRunningTime="2025-11-25 12:33:20.470460126 +0000 UTC m=+1520.388545507"
Nov 25 12:33:21 crc kubenswrapper[4693]: I1125 12:33:21.425094 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qsr66"
Nov 25 12:33:21 crc kubenswrapper[4693]: I1125 12:33:21.517599 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qsr66"
Nov 25 12:33:22 crc kubenswrapper[4693]: I1125 12:33:22.171354 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tf8rj"
Nov 25 12:33:22 crc kubenswrapper[4693]: I1125 12:33:22.171438 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tf8rj"
Nov 25 12:33:22 crc kubenswrapper[4693]: I1125 12:33:22.233798 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tf8rj"
Nov 25 12:33:22 crc kubenswrapper[4693]: I1125 12:33:22.508961 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tf8rj"
Nov 25 12:33:22 crc kubenswrapper[4693]: I1125 12:33:22.614261 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cb88v"
Nov 25 12:33:22 crc kubenswrapper[4693]: I1125 12:33:22.614330 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cb88v"
Nov 25 12:33:22 crc kubenswrapper[4693]: I1125 12:33:22.667690 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cb88v"
Nov 25 12:33:23 crc kubenswrapper[4693]: I1125 12:33:23.562022 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cb88v"
Nov 25 12:33:25 crc kubenswrapper[4693]: I1125 12:33:25.780850 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dfqrn"
Nov 25 12:33:25 crc kubenswrapper[4693]: I1125 12:33:25.781180 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dfqrn"
Nov 25 12:33:25 crc kubenswrapper[4693]: I1125 12:33:25.844128 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dfqrn"
Nov 25 12:33:26 crc kubenswrapper[4693]: I1125 12:33:26.033340 4693 scope.go:117] "RemoveContainer" containerID="2eb4cf38a68361738c7dbe495c83e8a0047e30c72dc62f70f6869ed112b42ddd"
Nov 25 12:33:26 crc kubenswrapper[4693]: I1125 12:33:26.062518 4693 scope.go:117] "RemoveContainer" containerID="961aaf75dc8888f81ceeb9e48896e6b94d151b30cf1e38ef118b0cbec141e375"
Nov 25 12:33:26 crc kubenswrapper[4693]: I1125 12:33:26.127646 4693 scope.go:117] "RemoveContainer" containerID="bc7e5d881cdd7f561933c7688600977769a0fecd40b302575bb4ea5fd25ae2d9"
Nov 25 12:33:26 crc kubenswrapper[4693]: I1125 12:33:26.575611 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dfqrn"
Nov 25 12:33:35 crc kubenswrapper[4693]: I1125 12:33:35.113666 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:33:35 crc kubenswrapper[4693]: I1125 12:33:35.114188 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:33:47 crc kubenswrapper[4693]: I1125 12:33:47.632318 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cb88v"]
Nov 25 12:33:47 crc kubenswrapper[4693]: I1125 12:33:47.633283 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cb88v" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="registry-server" containerID="cri-o://673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7" gracePeriod=2
Nov 25 12:33:47 crc kubenswrapper[4693]: I1125 12:33:47.831687 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-55s94"]
Nov 25 12:33:47 crc kubenswrapper[4693]: I1125 12:33:47.832011 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-55s94" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="registry-server" containerID="cri-o://28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375" gracePeriod=2
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.192227 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cb88v"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.332241 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ncr6\" (UniqueName: \"kubernetes.io/projected/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-kube-api-access-5ncr6\") pod \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") "
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.332328 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-catalog-content\") pod \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") "
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.332417 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-utilities\") pod \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\" (UID: \"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4\") "
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.333469 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-utilities" (OuterVolumeSpecName: "utilities") pod "93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" (UID: "93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.354548 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-kube-api-access-5ncr6" (OuterVolumeSpecName: "kube-api-access-5ncr6") pod "93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" (UID: "93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4"). InnerVolumeSpecName "kube-api-access-5ncr6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.400017 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" (UID: "93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.437545 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ncr6\" (UniqueName: \"kubernetes.io/projected/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-kube-api-access-5ncr6\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.437576 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.437586 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.440664 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf8rj"]
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.441139 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tf8rj" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="registry-server" containerID="cri-o://4f19012168fe8923a70a3e09db29fd40db13bb69b59282161f1ade9afb029eef" gracePeriod=2
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.452246 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-55s94"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.540053 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-catalog-content\") pod \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") "
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.611049 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" (UID: "0abeb214-1ea0-49cc-bcda-77f91bfcb5ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.641312 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-utilities\") pod \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") "
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.643156 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptmwz\" (UniqueName: \"kubernetes.io/projected/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-kube-api-access-ptmwz\") pod \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\" (UID: \"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef\") "
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.641807 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-utilities" (OuterVolumeSpecName: "utilities") pod "0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" (UID: "0abeb214-1ea0-49cc-bcda-77f91bfcb5ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.643978 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.644062 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.649713 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-kube-api-access-ptmwz" (OuterVolumeSpecName: "kube-api-access-ptmwz") pod "0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" (UID: "0abeb214-1ea0-49cc-bcda-77f91bfcb5ef"). InnerVolumeSpecName "kube-api-access-ptmwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.745665 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptmwz\" (UniqueName: \"kubernetes.io/projected/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef-kube-api-access-ptmwz\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.762674 4693 generic.go:334] "Generic (PLEG): container finished" podID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerID="28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375" exitCode=0
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.762755 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-55s94"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.762764 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55s94" event={"ID":"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef","Type":"ContainerDied","Data":"28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375"}
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.762932 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-55s94" event={"ID":"0abeb214-1ea0-49cc-bcda-77f91bfcb5ef","Type":"ContainerDied","Data":"39d6f942efd1fe19b839d3f04affc0160991c996bc38a23299499bef11cae271"}
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.762967 4693 scope.go:117] "RemoveContainer" containerID="28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.766764 4693 generic.go:334] "Generic (PLEG): container finished" podID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerID="673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7" exitCode=0
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.766850 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb88v" event={"ID":"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4","Type":"ContainerDied","Data":"673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7"}
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.766916 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cb88v"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.766923 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb88v" event={"ID":"93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4","Type":"ContainerDied","Data":"b1d59bfc9851e2b2d6ee16c04e0aa694ac020e9561f1a8ec13ebe5b632d20b82"}
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.782596 4693 generic.go:334] "Generic (PLEG): container finished" podID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerID="4f19012168fe8923a70a3e09db29fd40db13bb69b59282161f1ade9afb029eef" exitCode=0
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.782638 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf8rj" event={"ID":"46fe51f7-8369-4689-a7fe-cf9a72784542","Type":"ContainerDied","Data":"4f19012168fe8923a70a3e09db29fd40db13bb69b59282161f1ade9afb029eef"}
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.819262 4693 scope.go:117] "RemoveContainer" containerID="2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.868740 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-55s94"]
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.888123 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-55s94"]
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.898630 4693 scope.go:117] "RemoveContainer" containerID="8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.901579 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cb88v"]
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.913601 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cb88v"]
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.929250 4693 scope.go:117] "RemoveContainer" containerID="28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375"
Nov 25 12:33:48 crc kubenswrapper[4693]: E1125 12:33:48.930646 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375\": container with ID starting with 28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375 not found: ID does not exist" containerID="28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.930680 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375"} err="failed to get container status \"28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375\": rpc error: code = NotFound desc = could not find container \"28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375\": container with ID starting with 28b88153bf4766b5752253d8b0e9e18502546cacf5287bace36402b9622ca375 not found: ID does not exist"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.930703 4693 scope.go:117] "RemoveContainer" containerID="2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64"
Nov 25 12:33:48 crc kubenswrapper[4693]: E1125 12:33:48.931045 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64\": container with ID starting with 2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64 not found: ID does not exist" containerID="2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.931071 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64"} err="failed to get container status \"2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64\": rpc error: code = NotFound desc = could not find container \"2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64\": container with ID starting with 2cc1f9463801889179f5ca76ff637b11718c79405db701059f2e4fb854a05c64 not found: ID does not exist"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.931084 4693 scope.go:117] "RemoveContainer" containerID="8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1"
Nov 25 12:33:48 crc kubenswrapper[4693]: E1125 12:33:48.931313 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1\": container with ID starting with 8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1 not found: ID does not exist" containerID="8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.931338 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1"} err="failed to get container status \"8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1\": rpc error: code = NotFound desc = could not find container \"8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1\": container with ID starting with 8a747428058f97fe53f4ab6f66974070a077721006f1031c7e0292d42a6f05a1 not found: ID does not exist"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.931350 4693 scope.go:117] "RemoveContainer" containerID="673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7"
Nov 25 12:33:48 crc kubenswrapper[4693]: I1125 12:33:48.972159 4693 scope.go:117] "RemoveContainer" containerID="6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.071150 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf8rj"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.080397 4693 scope.go:117] "RemoveContainer" containerID="6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.112080 4693 scope.go:117] "RemoveContainer" containerID="673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7"
Nov 25 12:33:49 crc kubenswrapper[4693]: E1125 12:33:49.112965 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7\": container with ID starting with 673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7 not found: ID does not exist" containerID="673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.112993 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7"} err="failed to get container status \"673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7\": rpc error: code = NotFound desc = could not find container \"673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7\": container with ID starting with 673e47b3d2e098afefea213776a80f466e08f20458eef6463265fb63543c9bc7 not found: ID does not exist"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.113013 4693 scope.go:117] "RemoveContainer" containerID="6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f"
Nov 25 12:33:49 crc kubenswrapper[4693]: E1125 12:33:49.113363 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f\": container with ID starting with 6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f not found: ID does not exist" containerID="6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.113402 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f"} err="failed to get container status \"6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f\": rpc error: code = NotFound desc = could not find container \"6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f\": container with ID starting with 6f2cb97a358ca8699f3983ac328066ffea94240dd3efbd57d6e9ea14e71f2a6f not found: ID does not exist"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.113418 4693 scope.go:117] "RemoveContainer" containerID="6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c"
Nov 25 12:33:49 crc kubenswrapper[4693]: E1125 12:33:49.114829 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c\": container with ID starting with 6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c not found: ID does not exist" containerID="6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.114851 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c"} err="failed to get container status \"6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c\": rpc error: code = NotFound desc = could not find container \"6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c\": container with ID starting with 6af203052da3f8a94d54c0d1a3c5b4f133611d14eaea4d318f7bf0dae8804d8c not found: ID does not exist"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.256939 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-utilities\") pod \"46fe51f7-8369-4689-a7fe-cf9a72784542\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") "
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.257095 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bffjq\" (UniqueName: \"kubernetes.io/projected/46fe51f7-8369-4689-a7fe-cf9a72784542-kube-api-access-bffjq\") pod \"46fe51f7-8369-4689-a7fe-cf9a72784542\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") "
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.257159 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-catalog-content\") pod \"46fe51f7-8369-4689-a7fe-cf9a72784542\" (UID: \"46fe51f7-8369-4689-a7fe-cf9a72784542\") "
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.257603 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-utilities" (OuterVolumeSpecName: "utilities") pod "46fe51f7-8369-4689-a7fe-cf9a72784542" (UID: "46fe51f7-8369-4689-a7fe-cf9a72784542"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.261977 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46fe51f7-8369-4689-a7fe-cf9a72784542-kube-api-access-bffjq" (OuterVolumeSpecName: "kube-api-access-bffjq") pod "46fe51f7-8369-4689-a7fe-cf9a72784542" (UID: "46fe51f7-8369-4689-a7fe-cf9a72784542"). InnerVolumeSpecName "kube-api-access-bffjq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.296678 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46fe51f7-8369-4689-a7fe-cf9a72784542" (UID: "46fe51f7-8369-4689-a7fe-cf9a72784542"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.359891 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bffjq\" (UniqueName: \"kubernetes.io/projected/46fe51f7-8369-4689-a7fe-cf9a72784542-kube-api-access-bffjq\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.359935 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.359948 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46fe51f7-8369-4689-a7fe-cf9a72784542-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.798540 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf8rj" event={"ID":"46fe51f7-8369-4689-a7fe-cf9a72784542","Type":"ContainerDied","Data":"1b0dddd62f04c966b280ffad281d0cb2d2cc439064b5b54c2cca19da17e0373b"}
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.798588 4693 scope.go:117] "RemoveContainer" containerID="4f19012168fe8923a70a3e09db29fd40db13bb69b59282161f1ade9afb029eef"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.798666 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf8rj"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.845945 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf8rj"]
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.847426 4693 scope.go:117] "RemoveContainer" containerID="889a7afbfb699e1c36e29949d963b41bd3b8de1d1775b9ef64838e4027cf5ddf"
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.860830 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tf8rj"]
Nov 25 12:33:49 crc kubenswrapper[4693]: I1125 12:33:49.868896 4693 scope.go:117] "RemoveContainer" containerID="367443c0b75014b52b6892ee7fa0ed013ac8e5b56eb2c4a05f26e135925ccf9a"
Nov 25 12:33:50 crc kubenswrapper[4693]: I1125 12:33:50.830331 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" path="/var/lib/kubelet/pods/0abeb214-1ea0-49cc-bcda-77f91bfcb5ef/volumes"
Nov 25 12:33:50 crc kubenswrapper[4693]: I1125 12:33:50.831434 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" path="/var/lib/kubelet/pods/46fe51f7-8369-4689-a7fe-cf9a72784542/volumes"
Nov 25 12:33:50 crc kubenswrapper[4693]: I1125 12:33:50.832259 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" path="/var/lib/kubelet/pods/93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4/volumes"
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.230793 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqrn"]
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.231100 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dfqrn" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="registry-server" containerID="cri-o://e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33" gracePeriod=2
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.790208 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqrn"
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.826517 4693 generic.go:334] "Generic (PLEG): container finished" podID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerID="e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33" exitCode=0
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.826569 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqrn" event={"ID":"2ebd062b-7782-43d3-ab46-dc4ee8c93b12","Type":"ContainerDied","Data":"e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33"}
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.826603 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfqrn" event={"ID":"2ebd062b-7782-43d3-ab46-dc4ee8c93b12","Type":"ContainerDied","Data":"0e971296924d5373b6d21e51d7c6dfafbc2117b124617e6a3ca5895330b2a1d0"}
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.826623 4693 scope.go:117] "RemoveContainer" containerID="e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33"
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.826789 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfqrn"
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.834331 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qsr66"]
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.834701 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qsr66" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="registry-server" containerID="cri-o://b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d" gracePeriod=2
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.876003 4693 scope.go:117] "RemoveContainer" containerID="6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1"
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.906705 4693 scope.go:117] "RemoveContainer" containerID="94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c"
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.907184 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-catalog-content\") pod \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") "
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.907266 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-utilities\") pod \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") "
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.907313 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htk8z\" (UniqueName: \"kubernetes.io/projected/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-kube-api-access-htk8z\") pod \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\" (UID: \"2ebd062b-7782-43d3-ab46-dc4ee8c93b12\") "
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.908038 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-utilities" (OuterVolumeSpecName: "utilities") pod "2ebd062b-7782-43d3-ab46-dc4ee8c93b12" (UID: "2ebd062b-7782-43d3-ab46-dc4ee8c93b12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.913826 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-kube-api-access-htk8z" (OuterVolumeSpecName: "kube-api-access-htk8z") pod "2ebd062b-7782-43d3-ab46-dc4ee8c93b12" (UID: "2ebd062b-7782-43d3-ab46-dc4ee8c93b12"). InnerVolumeSpecName "kube-api-access-htk8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:33:51 crc kubenswrapper[4693]: I1125 12:33:51.923050 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ebd062b-7782-43d3-ab46-dc4ee8c93b12" (UID: "2ebd062b-7782-43d3-ab46-dc4ee8c93b12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.009882 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.009949 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.009982 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htk8z\" (UniqueName: \"kubernetes.io/projected/2ebd062b-7782-43d3-ab46-dc4ee8c93b12-kube-api-access-htk8z\") on node \"crc\" DevicePath \"\""
Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.067341 4693 scope.go:117] "RemoveContainer" containerID="e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33"
Nov 25 12:33:52 crc kubenswrapper[4693]: E1125 12:33:52.069108 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33\": container with ID starting with e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33 not found: ID does not exist" containerID="e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33"
Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.069173 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33"} err="failed to get container status \"e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33\": rpc error: code = NotFound desc = could not find container \"e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33\": container with ID starting with e2aa109ea8fa1db72c08371a78dbcd4a274c26fc3e6c11cf8e12e9dfe45beb33 not found: ID does not exist"
Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.069215 4693 scope.go:117] "RemoveContainer" containerID="6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1"
Nov 25 12:33:52 crc kubenswrapper[4693]: E1125 12:33:52.069611 4693 log.go:32]
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1\": container with ID starting with 6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1 not found: ID does not exist" containerID="6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.069632 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1"} err="failed to get container status \"6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1\": rpc error: code = NotFound desc = could not find container \"6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1\": container with ID starting with 6c3e817d12933eb169fde50f313f84743dffa12f3230e3533c233906bf8712a1 not found: ID does not exist" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.069651 4693 scope.go:117] "RemoveContainer" containerID="94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c" Nov 25 12:33:52 crc kubenswrapper[4693]: E1125 12:33:52.070225 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c\": container with ID starting with 94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c not found: ID does not exist" containerID="94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.070265 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c"} err="failed to get container status \"94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c\": rpc error: code = NotFound desc = could not find container \"94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c\": container with ID starting with 94d2a9367b8c19aa74bb77bfcc349da2ba58cd942aa0268c38485a5e9aa3909c not found: ID does not exist" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.170675 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqrn"] Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.179758 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfqrn"] Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.303804 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.416666 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hcf9\" (UniqueName: \"kubernetes.io/projected/86631848-2a82-4a9e-b733-a44cf268a2e1-kube-api-access-7hcf9\") pod \"86631848-2a82-4a9e-b733-a44cf268a2e1\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.416860 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-catalog-content\") pod \"86631848-2a82-4a9e-b733-a44cf268a2e1\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.416934 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-utilities\") pod \"86631848-2a82-4a9e-b733-a44cf268a2e1\" (UID: \"86631848-2a82-4a9e-b733-a44cf268a2e1\") " Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.418410 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-utilities" (OuterVolumeSpecName: "utilities") pod "86631848-2a82-4a9e-b733-a44cf268a2e1" (UID: "86631848-2a82-4a9e-b733-a44cf268a2e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.428420 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86631848-2a82-4a9e-b733-a44cf268a2e1-kube-api-access-7hcf9" (OuterVolumeSpecName: "kube-api-access-7hcf9") pod "86631848-2a82-4a9e-b733-a44cf268a2e1" (UID: "86631848-2a82-4a9e-b733-a44cf268a2e1"). InnerVolumeSpecName "kube-api-access-7hcf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.495923 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86631848-2a82-4a9e-b733-a44cf268a2e1" (UID: "86631848-2a82-4a9e-b733-a44cf268a2e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.519422 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.519463 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86631848-2a82-4a9e-b733-a44cf268a2e1-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.519480 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hcf9\" (UniqueName: \"kubernetes.io/projected/86631848-2a82-4a9e-b733-a44cf268a2e1-kube-api-access-7hcf9\") on node \"crc\" DevicePath \"\"" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.825502 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" path="/var/lib/kubelet/pods/2ebd062b-7782-43d3-ab46-dc4ee8c93b12/volumes" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.839832 4693 generic.go:334] "Generic (PLEG): container finished" podID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerID="b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d" exitCode=0 Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.839882 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsr66" event={"ID":"86631848-2a82-4a9e-b733-a44cf268a2e1","Type":"ContainerDied","Data":"b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d"} Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.839924 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsr66" event={"ID":"86631848-2a82-4a9e-b733-a44cf268a2e1","Type":"ContainerDied","Data":"55125468c2aad2388f2c93a53200eb5edf506f1c4646cf0e0a3e91de0bbc1d7a"} Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.839927 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qsr66" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.839946 4693 scope.go:117] "RemoveContainer" containerID="b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.871715 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qsr66"] Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.872665 4693 scope.go:117] "RemoveContainer" containerID="3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.883425 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qsr66"] Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.899318 4693 scope.go:117] "RemoveContainer" containerID="b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.971674 4693 scope.go:117] "RemoveContainer" containerID="b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d" Nov 25 12:33:52 crc kubenswrapper[4693]: E1125 12:33:52.973731 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d\": container with ID starting with b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d not found: ID does not exist" containerID="b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.973768 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d"} err="failed to get container status \"b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d\": rpc error: code = NotFound desc = could not find container \"b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d\": container with ID starting with b1d7e48da48f8fe322c31ed4474b3a1186817bd8ee9e13df4a2e677bc13e884d not found: ID does not exist" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.973798 4693 scope.go:117] "RemoveContainer" containerID="3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a" Nov 25 12:33:52 crc kubenswrapper[4693]: E1125 12:33:52.974400 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a\": container with ID starting with 3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a not found: ID does not exist" containerID="3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.974448 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a"} err="failed to get container status \"3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a\": rpc error: code = NotFound desc = could not find container \"3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a\": container with ID starting with 3542eb3e862c51ca736d20f33d2991b7525be3b164c55687246af3810ec8a79a not found: ID does not exist" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.974480 4693 scope.go:117] "RemoveContainer" 
containerID="b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679" Nov 25 12:33:52 crc kubenswrapper[4693]: E1125 12:33:52.974827 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679\": container with ID starting with b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679 not found: ID does not exist" containerID="b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679" Nov 25 12:33:52 crc kubenswrapper[4693]: I1125 12:33:52.974876 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679"} err="failed to get container status \"b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679\": rpc error: code = NotFound desc = could not find container \"b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679\": container with ID starting with b46c3a7dca1e96621e0b30b9eebebefdf6a5b73e8b0e1966196483a4f6e6b679 not found: ID does not exist" Nov 25 12:33:54 crc kubenswrapper[4693]: I1125 12:33:54.823691 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" path="/var/lib/kubelet/pods/86631848-2a82-4a9e-b733-a44cf268a2e1/volumes" Nov 25 12:34:05 crc kubenswrapper[4693]: I1125 12:34:05.114065 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:34:05 crc kubenswrapper[4693]: I1125 12:34:05.114595 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.113654 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.114193 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.114252 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.115051 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:34:35 crc 
kubenswrapper[4693]: I1125 12:34:35.115117 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" gracePeriod=600 Nov 25 12:34:35 crc kubenswrapper[4693]: E1125 12:34:35.236631 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.258051 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" exitCode=0 Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.258093 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349"} Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.258125 4693 scope.go:117] "RemoveContainer" containerID="ce4776c622bc7e46d7d568ae624b5c3426e9dfd4bd443fa89113683ec10d405f" Nov 25 12:34:35 crc kubenswrapper[4693]: I1125 12:34:35.260059 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:34:35 crc kubenswrapper[4693]: E1125 12:34:35.260452 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:34:48 crc kubenswrapper[4693]: I1125 12:34:48.813210 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:34:48 crc kubenswrapper[4693]: E1125 12:34:48.813985 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:35:01 crc kubenswrapper[4693]: I1125 12:35:01.814591 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:35:01 crc kubenswrapper[4693]: E1125 12:35:01.815738 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:35:13 crc kubenswrapper[4693]: I1125 12:35:13.813645 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:35:13 crc kubenswrapper[4693]: E1125 12:35:13.814444 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:35:26 crc kubenswrapper[4693]: I1125 12:35:26.336103 4693 scope.go:117] "RemoveContainer" containerID="d4aec743d89e427e7a3053c368a5051515b48735a39d3153ddfb2a5358a94d39" Nov 25 12:35:26 crc kubenswrapper[4693]: I1125 12:35:26.363180 4693 scope.go:117] "RemoveContainer" containerID="4a68bd515c72d9e25bfbc68f09254063d96392dbe93a4e35febc03b1ad303a03" Nov 25 12:35:28 crc kubenswrapper[4693]: I1125 12:35:28.813220 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:35:28 crc kubenswrapper[4693]: E1125 12:35:28.814089 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:35:39 crc kubenswrapper[4693]: I1125 12:35:39.812784 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:35:39 crc kubenswrapper[4693]: E1125 12:35:39.813916 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:35:50 crc kubenswrapper[4693]: I1125 12:35:50.821124 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:35:50 crc kubenswrapper[4693]: E1125 12:35:50.821938 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:36:03 crc kubenswrapper[4693]: I1125 12:36:03.812656 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:36:03 crc kubenswrapper[4693]: E1125 12:36:03.813519 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:36:07 crc kubenswrapper[4693]: I1125 12:36:07.056715 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-b8msz"] Nov 25 12:36:07 crc kubenswrapper[4693]: I1125 12:36:07.067347 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-b8msz"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.036009 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3c6c-account-create-pd4xx"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.047291 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9sx2x"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.056075 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3c6c-account-create-pd4xx"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.065362 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-1059-account-create-r8kpj"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.073628 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zzzmj"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.081144 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9sx2x"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.089071 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-243d-account-create-fgqqs"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.097637 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-1059-account-create-r8kpj"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.105418 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-243d-account-create-fgqqs"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.113727 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zzzmj"] Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.826744 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09113a70-8e33-4573-978c-6fe86fa93b2f" path="/var/lib/kubelet/pods/09113a70-8e33-4573-978c-6fe86fa93b2f/volumes" Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.828109 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1353fe13-196b-4d9d-9217-8b3b8000f38d" path="/var/lib/kubelet/pods/1353fe13-196b-4d9d-9217-8b3b8000f38d/volumes" Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.828866 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60249003-1066-44a7-acae-8c8482813b62" path="/var/lib/kubelet/pods/60249003-1066-44a7-acae-8c8482813b62/volumes" Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.829576 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9dc298-2fa0-40fe-b98a-44653c91782a" path="/var/lib/kubelet/pods/8a9dc298-2fa0-40fe-b98a-44653c91782a/volumes" Nov 25 12:36:08 crc kubenswrapper[4693]: I1125 12:36:08.830914 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3509e5-23a3-440b-8160-8409e8127a8e" path="/var/lib/kubelet/pods/9f3509e5-23a3-440b-8160-8409e8127a8e/volumes" Nov 25 12:36:08 crc kubenswrapper[4693]: 
I1125 12:36:08.831633 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a119ae70-86da-487a-baf9-d09d2a36d4bb" path="/var/lib/kubelet/pods/a119ae70-86da-487a-baf9-d09d2a36d4bb/volumes" Nov 25 12:36:18 crc kubenswrapper[4693]: I1125 12:36:18.813219 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:36:18 crc kubenswrapper[4693]: E1125 12:36:18.813977 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:36:26 crc kubenswrapper[4693]: I1125 12:36:26.418850 4693 scope.go:117] "RemoveContainer" containerID="e1c00ddd28fc06e8314b731ab4ee7f42f1eae04e4727d3aebdb3dbf34bf8fc2e" Nov 25 12:36:26 crc kubenswrapper[4693]: I1125 12:36:26.446577 4693 scope.go:117] "RemoveContainer" containerID="9631b19ceec3e8e86179bc5242a8c10046fece462cd98cc3cf1577a51a0d671d" Nov 25 12:36:26 crc kubenswrapper[4693]: I1125 12:36:26.497964 4693 scope.go:117] "RemoveContainer" containerID="e623597a9f52924e3ea9438a620c605afefdd96a0bfe0467469078bed0abb383" Nov 25 12:36:26 crc kubenswrapper[4693]: I1125 12:36:26.541160 4693 scope.go:117] "RemoveContainer" containerID="2ac60ef80e7c440117c6fddd90ff6e40f0560049991e62afd1f5bcc2b45180d5" Nov 25 12:36:26 crc kubenswrapper[4693]: I1125 12:36:26.582476 4693 scope.go:117] "RemoveContainer" containerID="9657f26267d390371f31a92e624cb49876ca37a1f193faa8f25187c2cef5ab7b" Nov 25 12:36:26 crc kubenswrapper[4693]: I1125 12:36:26.659315 4693 scope.go:117] "RemoveContainer" containerID="2b22364c913b0fce382f695a146c711935c6249421c293cdd3383d82eff09c73" Nov 25 12:36:31 crc kubenswrapper[4693]: I1125 12:36:31.813707 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:36:31 crc kubenswrapper[4693]: E1125 12:36:31.814659 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:36:36 crc kubenswrapper[4693]: I1125 12:36:36.424383 4693 generic.go:334] "Generic (PLEG): container finished" podID="0c125840-c37c-445e-95d9-37c74703ea85" containerID="6455ecb7623988988ae65d8c44655e5a2150e26855bffaf2d6b010a29399019a" exitCode=0 Nov 25 12:36:36 crc kubenswrapper[4693]: I1125 12:36:36.424475 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97" event={"ID":"0c125840-c37c-445e-95d9-37c74703ea85","Type":"ContainerDied","Data":"6455ecb7623988988ae65d8c44655e5a2150e26855bffaf2d6b010a29399019a"} Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.050584 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-tmcg9"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.062213 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2wcm4"] Nov 25 12:36:37 
crc kubenswrapper[4693]: I1125 12:36:37.073907 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6aa6-account-create-8z45k"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.085514 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-b4hnc"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.095185 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e13d-account-create-2vllw"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.127275 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e13d-account-create-2vllw"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.136615 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2wcm4"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.145681 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6aa6-account-create-8z45k"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.155482 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-b4hnc"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.165488 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2aca-account-create-7xdtr"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.174832 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2aca-account-create-7xdtr"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.181929 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-tmcg9"] Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.848488 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97" Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.941098 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l98wb\" (UniqueName: \"kubernetes.io/projected/0c125840-c37c-445e-95d9-37c74703ea85-kube-api-access-l98wb\") pod \"0c125840-c37c-445e-95d9-37c74703ea85\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.941176 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-inventory\") pod \"0c125840-c37c-445e-95d9-37c74703ea85\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.941398 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-bootstrap-combined-ca-bundle\") pod \"0c125840-c37c-445e-95d9-37c74703ea85\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.941433 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-ssh-key\") pod \"0c125840-c37c-445e-95d9-37c74703ea85\" (UID: \"0c125840-c37c-445e-95d9-37c74703ea85\") " Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.948063 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") 
pod "0c125840-c37c-445e-95d9-37c74703ea85" (UID: "0c125840-c37c-445e-95d9-37c74703ea85"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.951932 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c125840-c37c-445e-95d9-37c74703ea85-kube-api-access-l98wb" (OuterVolumeSpecName: "kube-api-access-l98wb") pod "0c125840-c37c-445e-95d9-37c74703ea85" (UID: "0c125840-c37c-445e-95d9-37c74703ea85"). InnerVolumeSpecName "kube-api-access-l98wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.973132 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0c125840-c37c-445e-95d9-37c74703ea85" (UID: "0c125840-c37c-445e-95d9-37c74703ea85"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:36:37 crc kubenswrapper[4693]: I1125 12:36:37.975138 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-inventory" (OuterVolumeSpecName: "inventory") pod "0c125840-c37c-445e-95d9-37c74703ea85" (UID: "0c125840-c37c-445e-95d9-37c74703ea85"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.043405 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l98wb\" (UniqueName: \"kubernetes.io/projected/0c125840-c37c-445e-95d9-37c74703ea85-kube-api-access-l98wb\") on node \"crc\" DevicePath \"\"" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.043597 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.043670 4693 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.043725 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c125840-c37c-445e-95d9-37c74703ea85-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.449834 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97" event={"ID":"0c125840-c37c-445e-95d9-37c74703ea85","Type":"ContainerDied","Data":"3b9ea55446bd5004d75d859f5967ef760ca07c7fdb2e73e1bcc222f0f1aed413"} Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.449873 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9ea55446bd5004d75d859f5967ef760ca07c7fdb2e73e1bcc222f0f1aed413" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.449946 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-46z97" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525329 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2"] Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525739 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525758 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525781 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525788 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525796 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525802 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525816 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525822 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525831 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525839 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525847 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525853 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525865 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525870 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525882 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525889 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="extract-utilities" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525899 4693 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525904 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525912 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c125840-c37c-445e-95d9-37c74703ea85" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525919 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c125840-c37c-445e-95d9-37c74703ea85" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525928 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525933 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525946 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525952 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525964 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525970 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.525983 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.525989 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.526004 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526010 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: E1125 12:36:38.526029 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526034 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="extract-content" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526193 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ebd062b-7782-43d3-ab46-dc4ee8c93b12" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526214 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c125840-c37c-445e-95d9-37c74703ea85" 
containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526225 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="46fe51f7-8369-4689-a7fe-cf9a72784542" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526239 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0abeb214-1ea0-49cc-bcda-77f91bfcb5ef" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526247 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="86631848-2a82-4a9e-b733-a44cf268a2e1" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.526256 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c234c3-6c63-4cb0-b7a6-a8cbba9eacd4" containerName="registry-server" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.528237 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.530848 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.531102 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.531138 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.531839 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.539659 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2"] Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.657213 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.657282 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.657441 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmz6x\" (UniqueName: \"kubernetes.io/projected/81f5f268-3ead-442b-ae8f-d7e2c11a6752-kube-api-access-cmz6x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.759640 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.759734 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.759863 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmz6x\" (UniqueName: \"kubernetes.io/projected/81f5f268-3ead-442b-ae8f-d7e2c11a6752-kube-api-access-cmz6x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.765776 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.767016 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.781116 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmz6x\" (UniqueName: \"kubernetes.io/projected/81f5f268-3ead-442b-ae8f-d7e2c11a6752-kube-api-access-cmz6x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.826236 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d51f839-d354-4e68-91b1-63cdb3f3c3fa" path="/var/lib/kubelet/pods/6d51f839-d354-4e68-91b1-63cdb3f3c3fa/volumes" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.827031 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8be65a28-a340-4458-afba-068603fb0ec1" path="/var/lib/kubelet/pods/8be65a28-a340-4458-afba-068603fb0ec1/volumes" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.827787 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce6219d-3f7d-4731-a330-13d3dd9cb3c7" path="/var/lib/kubelet/pods/bce6219d-3f7d-4731-a330-13d3dd9cb3c7/volumes" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.828719 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca95ad70-7560-4759-9300-7e291663c116" path="/var/lib/kubelet/pods/ca95ad70-7560-4759-9300-7e291663c116/volumes" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.830409 4693 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1" path="/var/lib/kubelet/pods/cdacf3ab-72e9-4ed4-95ca-a03db2c1c5a1/volumes" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.831351 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d575f9b6-439e-4e0d-b056-3dbcfa35c81d" path="/var/lib/kubelet/pods/d575f9b6-439e-4e0d-b056-3dbcfa35c81d/volumes" Nov 25 12:36:38 crc kubenswrapper[4693]: I1125 12:36:38.850952 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:36:39 crc kubenswrapper[4693]: I1125 12:36:39.381490 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2"] Nov 25 12:36:39 crc kubenswrapper[4693]: I1125 12:36:39.387596 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 12:36:39 crc kubenswrapper[4693]: I1125 12:36:39.461982 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" event={"ID":"81f5f268-3ead-442b-ae8f-d7e2c11a6752","Type":"ContainerStarted","Data":"f28a57640214d7d0dd1b70b6a418256e5f73a132e639d51801ffcca3d35b7232"} Nov 25 12:36:40 crc kubenswrapper[4693]: I1125 12:36:40.474595 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" event={"ID":"81f5f268-3ead-442b-ae8f-d7e2c11a6752","Type":"ContainerStarted","Data":"12c7dbca56683252a38ef908d19d6c13ecb2e06587fab4c6a220f1b58fd243cd"} Nov 25 12:36:40 crc kubenswrapper[4693]: I1125 12:36:40.494792 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" podStartSLOduration=1.959435251 podStartE2EDuration="2.494767186s" podCreationTimestamp="2025-11-25 12:36:38 +0000 UTC" firstStartedPulling="2025-11-25 12:36:39.387360759 +0000 UTC m=+1719.305446140" lastFinishedPulling="2025-11-25 12:36:39.922692704 +0000 UTC m=+1719.840778075" observedRunningTime="2025-11-25 12:36:40.491022159 +0000 UTC m=+1720.409107540" watchObservedRunningTime="2025-11-25 12:36:40.494767186 +0000 UTC m=+1720.412852567" Nov 25 12:36:42 crc kubenswrapper[4693]: I1125 12:36:42.038270 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7rwc8"] Nov 25 12:36:42 crc kubenswrapper[4693]: I1125 12:36:42.048894 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7rwc8"] Nov 25 12:36:42 crc kubenswrapper[4693]: I1125 12:36:42.813893 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:36:42 crc kubenswrapper[4693]: E1125 12:36:42.814189 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:36:42 crc kubenswrapper[4693]: I1125 12:36:42.824366 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb56062-ca4e-44f8-b5a1-af139e355d6e" path="/var/lib/kubelet/pods/0fb56062-ca4e-44f8-b5a1-af139e355d6e/volumes" Nov 25 
12:36:57 crc kubenswrapper[4693]: I1125 12:36:57.812897 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:36:57 crc kubenswrapper[4693]: E1125 12:36:57.814055 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:37:12 crc kubenswrapper[4693]: I1125 12:37:12.813083 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:37:12 crc kubenswrapper[4693]: E1125 12:37:12.813795 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:37:24 crc kubenswrapper[4693]: I1125 12:37:24.813829 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:37:24 crc kubenswrapper[4693]: E1125 12:37:24.815070 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:37:26 crc kubenswrapper[4693]: I1125 12:37:26.808711 4693 scope.go:117] "RemoveContainer" containerID="0365948fea35c09bc61b609f21b4bd4fa4b216b01ad47b8bde568769ca757804" Nov 25 12:37:26 crc kubenswrapper[4693]: I1125 12:37:26.834604 4693 scope.go:117] "RemoveContainer" containerID="fa6bbeed6e3f3a7eaba951ed5b3c3ce02b321db589105d00b9ff7699b8dc6558" Nov 25 12:37:26 crc kubenswrapper[4693]: I1125 12:37:26.886814 4693 scope.go:117] "RemoveContainer" containerID="73d4401a25153a0144e65825b7545fd97acaf2e971e5530eb2ca3b649d96b144" Nov 25 12:37:26 crc kubenswrapper[4693]: I1125 12:37:26.935816 4693 scope.go:117] "RemoveContainer" containerID="133c9d55f230006c7069a4e502a0f5c448634ae0174e3da97df578cb288f2abd" Nov 25 12:37:26 crc kubenswrapper[4693]: I1125 12:37:26.990566 4693 scope.go:117] "RemoveContainer" containerID="a1a68526476f980a8923d3ed8ce10a849f044ad1b93f604a4317d2a03b195116" Nov 25 12:37:27 crc kubenswrapper[4693]: I1125 12:37:27.056517 4693 scope.go:117] "RemoveContainer" containerID="62c8ebe60a6f8e52b3547b1b87ec95eec93a20dd3b170b2c63d94d7be81c3468" Nov 25 12:37:27 crc kubenswrapper[4693]: I1125 12:37:27.081736 4693 scope.go:117] "RemoveContainer" containerID="1862438b965a6043f320ee9cc1d6fc04dedd0096a23736d0861143ad8e795db7" Nov 25 12:37:31 crc kubenswrapper[4693]: I1125 12:37:31.040681 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k89hc"] Nov 25 12:37:31 crc kubenswrapper[4693]: I1125 12:37:31.050409 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-bootstrap-k89hc"] Nov 25 12:37:32 crc kubenswrapper[4693]: I1125 12:37:32.055796 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qdmcl"] Nov 25 12:37:32 crc kubenswrapper[4693]: I1125 12:37:32.067253 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qdmcl"] Nov 25 12:37:32 crc kubenswrapper[4693]: I1125 12:37:32.826446 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="064804d9-6da0-42a1-b0bd-9505f77588f8" path="/var/lib/kubelet/pods/064804d9-6da0-42a1-b0bd-9505f77588f8/volumes" Nov 25 12:37:32 crc kubenswrapper[4693]: I1125 12:37:32.827091 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8135cf2-4e92-4e70-9c47-f5fae388c0be" path="/var/lib/kubelet/pods/e8135cf2-4e92-4e70-9c47-f5fae388c0be/volumes" Nov 25 12:37:33 crc kubenswrapper[4693]: I1125 12:37:33.034746 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-trpsm"] Nov 25 12:37:33 crc kubenswrapper[4693]: I1125 12:37:33.045319 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-trpsm"] Nov 25 12:37:34 crc kubenswrapper[4693]: I1125 12:37:34.828737 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e953d6de-5a10-4627-8a1b-654ce6219d52" path="/var/lib/kubelet/pods/e953d6de-5a10-4627-8a1b-654ce6219d52/volumes" Nov 25 12:37:37 crc kubenswrapper[4693]: I1125 12:37:37.813335 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:37:37 crc kubenswrapper[4693]: E1125 12:37:37.813969 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:37:46 crc kubenswrapper[4693]: I1125 12:37:46.045568 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-68nxm"] Nov 25 12:37:46 crc kubenswrapper[4693]: I1125 12:37:46.055350 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-68nxm"] Nov 25 12:37:46 crc kubenswrapper[4693]: I1125 12:37:46.830122 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6340c0be-f12e-4dac-908a-480c7ed0e1e8" path="/var/lib/kubelet/pods/6340c0be-f12e-4dac-908a-480c7ed0e1e8/volumes" Nov 25 12:37:48 crc kubenswrapper[4693]: I1125 12:37:48.034191 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wqltl"] Nov 25 12:37:48 crc kubenswrapper[4693]: I1125 12:37:48.046117 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wqltl"] Nov 25 12:37:48 crc kubenswrapper[4693]: I1125 12:37:48.813166 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:37:48 crc kubenswrapper[4693]: E1125 12:37:48.813564 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:37:48 crc kubenswrapper[4693]: I1125 12:37:48.824634 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e344532-aeaf-4acf-9d1c-ebc0290e406e" path="/var/lib/kubelet/pods/7e344532-aeaf-4acf-9d1c-ebc0290e406e/volumes" Nov 25 12:37:51 crc kubenswrapper[4693]: I1125 12:37:51.026360 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-kqn9c"] Nov 25 12:37:51 crc kubenswrapper[4693]: I1125 12:37:51.035692 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-kqn9c"] Nov 25 12:37:52 crc kubenswrapper[4693]: I1125 12:37:52.824068 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd001ffc-9a83-408f-bc46-9a7cacf052c7" path="/var/lib/kubelet/pods/fd001ffc-9a83-408f-bc46-9a7cacf052c7/volumes" Nov 25 12:38:03 crc kubenswrapper[4693]: I1125 12:38:03.812501 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:38:03 crc kubenswrapper[4693]: E1125 12:38:03.813308 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:38:14 crc kubenswrapper[4693]: I1125 12:38:14.813462 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:38:14 crc kubenswrapper[4693]: E1125 12:38:14.814304 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:38:25 crc kubenswrapper[4693]: I1125 12:38:25.813184 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:38:25 crc kubenswrapper[4693]: E1125 12:38:25.814051 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:38:26 crc kubenswrapper[4693]: I1125 12:38:26.485007 4693 generic.go:334] "Generic (PLEG): container finished" podID="81f5f268-3ead-442b-ae8f-d7e2c11a6752" containerID="12c7dbca56683252a38ef908d19d6c13ecb2e06587fab4c6a220f1b58fd243cd" exitCode=0 Nov 25 12:38:26 crc kubenswrapper[4693]: I1125 12:38:26.485055 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" event={"ID":"81f5f268-3ead-442b-ae8f-d7e2c11a6752","Type":"ContainerDied","Data":"12c7dbca56683252a38ef908d19d6c13ecb2e06587fab4c6a220f1b58fd243cd"} Nov 25 12:38:27 
crc kubenswrapper[4693]: I1125 12:38:27.251398 4693 scope.go:117] "RemoveContainer" containerID="26492e710508f2f724b717ad10b8e86c2276539896a9d034e87babd3e7345c30" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.294830 4693 scope.go:117] "RemoveContainer" containerID="8f7f05217b5ef487dfd9d9d2732365c15c6ba795b96a06a0e8ae5807fc0afe5e" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.340770 4693 scope.go:117] "RemoveContainer" containerID="fd1b4bd19e829de777b80ab83cf74a05573309a57924b5a358ccc1f4d874ed0f" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.378295 4693 scope.go:117] "RemoveContainer" containerID="a54d415c0cd66def5d5567c9cf0aac4246321420e6dd522d9247dcb5e9c9be6c" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.435082 4693 scope.go:117] "RemoveContainer" containerID="065a856844c25d7a4c0e2ee8e0d95238d6749fa92e4591e83774bc825588066c" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.470538 4693 scope.go:117] "RemoveContainer" containerID="f27fcc7bf6f370050ccc53520f9ced274f34d5f7fa6e8acf1c73bc6da3c1a826" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.824746 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.946046 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-inventory\") pod \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.946481 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-ssh-key\") pod \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.946530 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmz6x\" (UniqueName: \"kubernetes.io/projected/81f5f268-3ead-442b-ae8f-d7e2c11a6752-kube-api-access-cmz6x\") pod \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\" (UID: \"81f5f268-3ead-442b-ae8f-d7e2c11a6752\") " Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.953804 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f5f268-3ead-442b-ae8f-d7e2c11a6752-kube-api-access-cmz6x" (OuterVolumeSpecName: "kube-api-access-cmz6x") pod "81f5f268-3ead-442b-ae8f-d7e2c11a6752" (UID: "81f5f268-3ead-442b-ae8f-d7e2c11a6752"). InnerVolumeSpecName "kube-api-access-cmz6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.975090 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-inventory" (OuterVolumeSpecName: "inventory") pod "81f5f268-3ead-442b-ae8f-d7e2c11a6752" (UID: "81f5f268-3ead-442b-ae8f-d7e2c11a6752"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:38:27 crc kubenswrapper[4693]: I1125 12:38:27.980204 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81f5f268-3ead-442b-ae8f-d7e2c11a6752" (UID: "81f5f268-3ead-442b-ae8f-d7e2c11a6752"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.048732 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.048768 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81f5f268-3ead-442b-ae8f-d7e2c11a6752-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.048777 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmz6x\" (UniqueName: \"kubernetes.io/projected/81f5f268-3ead-442b-ae8f-d7e2c11a6752-kube-api-access-cmz6x\") on node \"crc\" DevicePath \"\"" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.509748 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" event={"ID":"81f5f268-3ead-442b-ae8f-d7e2c11a6752","Type":"ContainerDied","Data":"f28a57640214d7d0dd1b70b6a418256e5f73a132e639d51801ffcca3d35b7232"} Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.509794 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f28a57640214d7d0dd1b70b6a418256e5f73a132e639d51801ffcca3d35b7232" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.510993 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.596727 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr"] Nov 25 12:38:28 crc kubenswrapper[4693]: E1125 12:38:28.597804 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f5f268-3ead-442b-ae8f-d7e2c11a6752" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.597909 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f5f268-3ead-442b-ae8f-d7e2c11a6752" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.598321 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f5f268-3ead-442b-ae8f-d7e2c11a6752" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.599434 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.602819 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.602947 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.603211 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.603489 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.606207 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr"] Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.662045 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2fn\" (UniqueName: \"kubernetes.io/projected/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-kube-api-access-pn2fn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.662463 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.662567 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.764489 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2fn\" (UniqueName: \"kubernetes.io/projected/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-kube-api-access-pn2fn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.764555 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.764652 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.771543 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.786094 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.787684 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2fn\" (UniqueName: \"kubernetes.io/projected/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-kube-api-access-pn2fn\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qfskr\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:28 crc kubenswrapper[4693]: I1125 12:38:28.926261 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:38:29 crc kubenswrapper[4693]: I1125 12:38:29.436308 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr"] Nov 25 12:38:29 crc kubenswrapper[4693]: I1125 12:38:29.519545 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" event={"ID":"216fd77e-1bfd-4c99-8fd8-2711d9de6beb","Type":"ContainerStarted","Data":"ac7717703eebd7f8e7962435eb8405f67d014bd6bd416088feffacebcca8589f"} Nov 25 12:38:31 crc kubenswrapper[4693]: I1125 12:38:31.051157 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-t8c6z"] Nov 25 12:38:31 crc kubenswrapper[4693]: I1125 12:38:31.061694 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-t8c6z"] Nov 25 12:38:31 crc kubenswrapper[4693]: I1125 12:38:31.538352 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" event={"ID":"216fd77e-1bfd-4c99-8fd8-2711d9de6beb","Type":"ContainerStarted","Data":"97ddb226ad2b58bcba13a033cdd2f282dbb088e2a10e06e34cf37ecda70fb1b4"} Nov 25 12:38:31 crc kubenswrapper[4693]: I1125 12:38:31.553160 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" podStartSLOduration=2.636535083 podStartE2EDuration="3.553142007s" podCreationTimestamp="2025-11-25 12:38:28 +0000 UTC" firstStartedPulling="2025-11-25 12:38:29.444729134 +0000 UTC m=+1829.362814515" lastFinishedPulling="2025-11-25 12:38:30.361336058 +0000 UTC m=+1830.279421439" observedRunningTime="2025-11-25 12:38:31.551923935 +0000 UTC m=+1831.470009316" watchObservedRunningTime="2025-11-25 
12:38:31.553142007 +0000 UTC m=+1831.471227388" Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.044151 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-k5vzm"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.067831 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5f13-account-create-s2vtk"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.087464 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7f2a-account-create-gfrxd"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.099104 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-k5vzm"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.108934 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e58d-account-create-wxbnx"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.118280 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5f13-account-create-s2vtk"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.129784 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f5pb7"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.141978 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f5pb7"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.155724 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e58d-account-create-wxbnx"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.165314 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7f2a-account-create-gfrxd"] Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.827793 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e34f200-34da-479a-a41c-f831d9e94220" path="/var/lib/kubelet/pods/5e34f200-34da-479a-a41c-f831d9e94220/volumes" Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.828507 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d1b45a-575e-42f4-b82a-cd8771e650e1" path="/var/lib/kubelet/pods/63d1b45a-575e-42f4-b82a-cd8771e650e1/volumes" Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.829096 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75cab84e-e6f2-4b22-b67b-092223f7bc87" path="/var/lib/kubelet/pods/75cab84e-e6f2-4b22-b67b-092223f7bc87/volumes" Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.829886 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f22823-54fe-41b3-8918-1f9920948635" path="/var/lib/kubelet/pods/76f22823-54fe-41b3-8918-1f9920948635/volumes" Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.831147 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f09b87-30d2-48ef-8b6b-13c25206a68d" path="/var/lib/kubelet/pods/93f09b87-30d2-48ef-8b6b-13c25206a68d/volumes" Nov 25 12:38:32 crc kubenswrapper[4693]: I1125 12:38:32.831865 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e055aa-e90a-4be0-a519-a9b30151eaa3" path="/var/lib/kubelet/pods/a5e055aa-e90a-4be0-a519-a9b30151eaa3/volumes" Nov 25 12:38:40 crc kubenswrapper[4693]: I1125 12:38:40.813103 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:38:40 crc kubenswrapper[4693]: E1125 12:38:40.814056 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:38:54 crc kubenswrapper[4693]: I1125 12:38:54.813996 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:38:54 crc kubenswrapper[4693]: E1125 12:38:54.814864 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:39:04 crc kubenswrapper[4693]: I1125 12:39:04.045901 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mwsdc"] Nov 25 12:39:04 crc kubenswrapper[4693]: I1125 12:39:04.058526 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mwsdc"] Nov 25 12:39:04 crc kubenswrapper[4693]: I1125 12:39:04.834509 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49348558-30c7-450b-978b-0be5ea427c08" path="/var/lib/kubelet/pods/49348558-30c7-450b-978b-0be5ea427c08/volumes" Nov 25 12:39:05 crc kubenswrapper[4693]: I1125 12:39:05.813025 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:39:05 crc kubenswrapper[4693]: E1125 12:39:05.813441 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:39:16 crc kubenswrapper[4693]: I1125 12:39:16.814413 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:39:16 crc kubenswrapper[4693]: E1125 12:39:16.822361 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 12:39:27.680985 4693 scope.go:117] "RemoveContainer" containerID="bdac18db2389d15ad40ef97c53b99d8c43a79125d4358735adc97afe440a32dc" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 12:39:27.708511 4693 scope.go:117] "RemoveContainer" containerID="87bc75025aae6c2df3afa013ecdd94fc884d4fca980e665e3058c7afd5fac2f8" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 12:39:27.784688 4693 scope.go:117] "RemoveContainer" containerID="e0c68fe472628b7a794d21897e8e4c8227fdf442c53de6c71d4e21eff02a3c4f" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 
12:39:27.813472 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:39:27 crc kubenswrapper[4693]: E1125 12:39:27.813897 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 12:39:27.831229 4693 scope.go:117] "RemoveContainer" containerID="f1ea9325b99eeacb6a7f039e3e6dc0dae978ebf7fd1d76b721a21451ad440e6f" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 12:39:27.855939 4693 scope.go:117] "RemoveContainer" containerID="391ad313f567f214ba07a3a03bba6df3dbfa996993de563ca58116a3db0968c9" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 12:39:27.914030 4693 scope.go:117] "RemoveContainer" containerID="641d4049a9d40f0c4affb2418df3cae80ff72148b4f727aaf6f6ebef7779c0fd" Nov 25 12:39:27 crc kubenswrapper[4693]: I1125 12:39:27.958524 4693 scope.go:117] "RemoveContainer" containerID="6c754fe65bd363484ffe743c05170e5a95e072704c87acdb666925d53eee2b22" Nov 25 12:39:28 crc kubenswrapper[4693]: I1125 12:39:28.046299 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kfp5p"] Nov 25 12:39:28 crc kubenswrapper[4693]: I1125 12:39:28.057543 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kfp5p"] Nov 25 12:39:28 crc kubenswrapper[4693]: I1125 12:39:28.826209 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505f634a-96dc-4bab-9cf6-416ca6ebf3df" path="/var/lib/kubelet/pods/505f634a-96dc-4bab-9cf6-416ca6ebf3df/volumes" Nov 25 12:39:30 crc kubenswrapper[4693]: I1125 12:39:30.029811 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-z88lk"] Nov 25 12:39:30 crc kubenswrapper[4693]: I1125 12:39:30.039596 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-z88lk"] Nov 25 12:39:30 crc kubenswrapper[4693]: I1125 12:39:30.829929 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a868736f-308b-4590-966c-f4d01a5da39a" path="/var/lib/kubelet/pods/a868736f-308b-4590-966c-f4d01a5da39a/volumes" Nov 25 12:39:42 crc kubenswrapper[4693]: I1125 12:39:42.813560 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:39:43 crc kubenswrapper[4693]: I1125 12:39:43.187036 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"be2aba682d9474189f085318cf98009d0155e53628b3d276f5e0ba4c49edb9d9"} Nov 25 12:39:45 crc kubenswrapper[4693]: I1125 12:39:45.204835 4693 generic.go:334] "Generic (PLEG): container finished" podID="216fd77e-1bfd-4c99-8fd8-2711d9de6beb" containerID="97ddb226ad2b58bcba13a033cdd2f282dbb088e2a10e06e34cf37ecda70fb1b4" exitCode=0 Nov 25 12:39:45 crc kubenswrapper[4693]: I1125 12:39:45.205021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" 
event={"ID":"216fd77e-1bfd-4c99-8fd8-2711d9de6beb","Type":"ContainerDied","Data":"97ddb226ad2b58bcba13a033cdd2f282dbb088e2a10e06e34cf37ecda70fb1b4"} Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.638156 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.815563 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-inventory\") pod \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.815713 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn2fn\" (UniqueName: \"kubernetes.io/projected/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-kube-api-access-pn2fn\") pod \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.815817 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-ssh-key\") pod \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\" (UID: \"216fd77e-1bfd-4c99-8fd8-2711d9de6beb\") " Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.822243 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-kube-api-access-pn2fn" (OuterVolumeSpecName: "kube-api-access-pn2fn") pod "216fd77e-1bfd-4c99-8fd8-2711d9de6beb" (UID: "216fd77e-1bfd-4c99-8fd8-2711d9de6beb"). InnerVolumeSpecName "kube-api-access-pn2fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.844688 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "216fd77e-1bfd-4c99-8fd8-2711d9de6beb" (UID: "216fd77e-1bfd-4c99-8fd8-2711d9de6beb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.852701 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-inventory" (OuterVolumeSpecName: "inventory") pod "216fd77e-1bfd-4c99-8fd8-2711d9de6beb" (UID: "216fd77e-1bfd-4c99-8fd8-2711d9de6beb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.918916 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.919033 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:39:46 crc kubenswrapper[4693]: I1125 12:39:46.919063 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn2fn\" (UniqueName: \"kubernetes.io/projected/216fd77e-1bfd-4c99-8fd8-2711d9de6beb-kube-api-access-pn2fn\") on node \"crc\" DevicePath \"\"" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.229838 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" event={"ID":"216fd77e-1bfd-4c99-8fd8-2711d9de6beb","Type":"ContainerDied","Data":"ac7717703eebd7f8e7962435eb8405f67d014bd6bd416088feffacebcca8589f"} Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.229891 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7717703eebd7f8e7962435eb8405f67d014bd6bd416088feffacebcca8589f" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.229929 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qfskr" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.318760 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv"] Nov 25 12:39:47 crc kubenswrapper[4693]: E1125 12:39:47.319217 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="216fd77e-1bfd-4c99-8fd8-2711d9de6beb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.319237 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="216fd77e-1bfd-4c99-8fd8-2711d9de6beb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.319452 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="216fd77e-1bfd-4c99-8fd8-2711d9de6beb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.320146 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.324300 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.326317 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.326370 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.326397 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.333331 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv"] Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.427933 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljjg\" (UniqueName: \"kubernetes.io/projected/6afe8ee4-7d98-4751-a224-b99437561d70-kube-api-access-qljjg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.428008 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.428100 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.531098 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.531572 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.532091 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qljjg\" (UniqueName: \"kubernetes.io/projected/6afe8ee4-7d98-4751-a224-b99437561d70-kube-api-access-qljjg\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.537112 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.541712 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.560661 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljjg\" (UniqueName: \"kubernetes.io/projected/6afe8ee4-7d98-4751-a224-b99437561d70-kube-api-access-qljjg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:47 crc kubenswrapper[4693]: I1125 12:39:47.649141 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:48 crc kubenswrapper[4693]: I1125 12:39:48.153332 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv"] Nov 25 12:39:48 crc kubenswrapper[4693]: I1125 12:39:48.238937 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" event={"ID":"6afe8ee4-7d98-4751-a224-b99437561d70","Type":"ContainerStarted","Data":"6cc021e445b694ada1d373b61cb178fc0a889800691d53af97cdfa2394aeab97"} Nov 25 12:39:49 crc kubenswrapper[4693]: I1125 12:39:49.247550 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" event={"ID":"6afe8ee4-7d98-4751-a224-b99437561d70","Type":"ContainerStarted","Data":"cfa861fc689d8fdf22ac82fdea3a4f2c0827a9fe2e85c17197ede6d7254efd1b"} Nov 25 12:39:49 crc kubenswrapper[4693]: I1125 12:39:49.280158 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" podStartSLOduration=1.66603641 podStartE2EDuration="2.280137557s" podCreationTimestamp="2025-11-25 12:39:47 +0000 UTC" firstStartedPulling="2025-11-25 12:39:48.16045522 +0000 UTC m=+1908.078540601" lastFinishedPulling="2025-11-25 12:39:48.774556367 +0000 UTC m=+1908.692641748" observedRunningTime="2025-11-25 12:39:49.270220698 +0000 UTC m=+1909.188306099" watchObservedRunningTime="2025-11-25 12:39:49.280137557 +0000 UTC m=+1909.198222938" Nov 25 12:39:54 crc kubenswrapper[4693]: I1125 12:39:54.297451 4693 generic.go:334] "Generic (PLEG): container finished" podID="6afe8ee4-7d98-4751-a224-b99437561d70" containerID="cfa861fc689d8fdf22ac82fdea3a4f2c0827a9fe2e85c17197ede6d7254efd1b" exitCode=0 Nov 25 12:39:54 crc kubenswrapper[4693]: I1125 
12:39:54.297527 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" event={"ID":"6afe8ee4-7d98-4751-a224-b99437561d70","Type":"ContainerDied","Data":"cfa861fc689d8fdf22ac82fdea3a4f2c0827a9fe2e85c17197ede6d7254efd1b"} Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.708459 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.894233 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qljjg\" (UniqueName: \"kubernetes.io/projected/6afe8ee4-7d98-4751-a224-b99437561d70-kube-api-access-qljjg\") pod \"6afe8ee4-7d98-4751-a224-b99437561d70\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.894278 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-ssh-key\") pod \"6afe8ee4-7d98-4751-a224-b99437561d70\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.894533 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-inventory\") pod \"6afe8ee4-7d98-4751-a224-b99437561d70\" (UID: \"6afe8ee4-7d98-4751-a224-b99437561d70\") " Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.900962 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6afe8ee4-7d98-4751-a224-b99437561d70-kube-api-access-qljjg" (OuterVolumeSpecName: "kube-api-access-qljjg") pod "6afe8ee4-7d98-4751-a224-b99437561d70" (UID: "6afe8ee4-7d98-4751-a224-b99437561d70"). InnerVolumeSpecName "kube-api-access-qljjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.923732 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6afe8ee4-7d98-4751-a224-b99437561d70" (UID: "6afe8ee4-7d98-4751-a224-b99437561d70"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.929629 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-inventory" (OuterVolumeSpecName: "inventory") pod "6afe8ee4-7d98-4751-a224-b99437561d70" (UID: "6afe8ee4-7d98-4751-a224-b99437561d70"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.996623 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.996927 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qljjg\" (UniqueName: \"kubernetes.io/projected/6afe8ee4-7d98-4751-a224-b99437561d70-kube-api-access-qljjg\") on node \"crc\" DevicePath \"\"" Nov 25 12:39:55 crc kubenswrapper[4693]: I1125 12:39:55.996937 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6afe8ee4-7d98-4751-a224-b99437561d70-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.317775 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" event={"ID":"6afe8ee4-7d98-4751-a224-b99437561d70","Type":"ContainerDied","Data":"6cc021e445b694ada1d373b61cb178fc0a889800691d53af97cdfa2394aeab97"} Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.317825 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc021e445b694ada1d373b61cb178fc0a889800691d53af97cdfa2394aeab97" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.317837 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.451133 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g"] Nov 25 12:39:56 crc kubenswrapper[4693]: E1125 12:39:56.451509 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6afe8ee4-7d98-4751-a224-b99437561d70" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.451525 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6afe8ee4-7d98-4751-a224-b99437561d70" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.451715 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6afe8ee4-7d98-4751-a224-b99437561d70" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.452314 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.454302 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.454499 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.454572 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.459757 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.465034 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g"] Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.606751 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.606816 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.606892 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxh7n\" (UniqueName: \"kubernetes.io/projected/c0c79f30-0e24-4101-8632-19de1642f7e2-kube-api-access-fxh7n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.708809 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.708871 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxh7n\" (UniqueName: \"kubernetes.io/projected/c0c79f30-0e24-4101-8632-19de1642f7e2-kube-api-access-fxh7n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.708989 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: 
\"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.713480 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.713823 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.730991 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxh7n\" (UniqueName: \"kubernetes.io/projected/c0c79f30-0e24-4101-8632-19de1642f7e2-kube-api-access-fxh7n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-kvl2g\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:56 crc kubenswrapper[4693]: I1125 12:39:56.780180 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:39:57 crc kubenswrapper[4693]: I1125 12:39:57.291250 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g"] Nov 25 12:39:57 crc kubenswrapper[4693]: W1125 12:39:57.296584 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0c79f30_0e24_4101_8632_19de1642f7e2.slice/crio-fe552a21f0388734d89837e5083673321cd035b11e9bddddb1d88ce1bfa44491 WatchSource:0}: Error finding container fe552a21f0388734d89837e5083673321cd035b11e9bddddb1d88ce1bfa44491: Status 404 returned error can't find the container with id fe552a21f0388734d89837e5083673321cd035b11e9bddddb1d88ce1bfa44491 Nov 25 12:39:57 crc kubenswrapper[4693]: I1125 12:39:57.327067 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" event={"ID":"c0c79f30-0e24-4101-8632-19de1642f7e2","Type":"ContainerStarted","Data":"fe552a21f0388734d89837e5083673321cd035b11e9bddddb1d88ce1bfa44491"} Nov 25 12:39:58 crc kubenswrapper[4693]: I1125 12:39:58.335868 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" event={"ID":"c0c79f30-0e24-4101-8632-19de1642f7e2","Type":"ContainerStarted","Data":"f46149d5dadb5592f9884618205d9a1a92d6092053680aa38af6ad851e2d5200"} Nov 25 12:40:12 crc kubenswrapper[4693]: I1125 12:40:12.044465 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" podStartSLOduration=15.586878054 podStartE2EDuration="16.044444552s" podCreationTimestamp="2025-11-25 12:39:56 +0000 UTC" firstStartedPulling="2025-11-25 12:39:57.298175413 +0000 UTC m=+1917.216260794" lastFinishedPulling="2025-11-25 12:39:57.755741911 +0000 UTC m=+1917.673827292" observedRunningTime="2025-11-25 12:39:58.350111155 +0000 UTC 
m=+1918.268196536" watchObservedRunningTime="2025-11-25 12:40:12.044444552 +0000 UTC m=+1931.962529933" Nov 25 12:40:12 crc kubenswrapper[4693]: I1125 12:40:12.046644 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gs7bm"] Nov 25 12:40:12 crc kubenswrapper[4693]: I1125 12:40:12.055587 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gs7bm"] Nov 25 12:40:12 crc kubenswrapper[4693]: I1125 12:40:12.826029 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6259bc4-d44b-4b7e-8228-2cfa55f87da8" path="/var/lib/kubelet/pods/d6259bc4-d44b-4b7e-8228-2cfa55f87da8/volumes" Nov 25 12:40:28 crc kubenswrapper[4693]: I1125 12:40:28.138529 4693 scope.go:117] "RemoveContainer" containerID="26b3a27127a7b5b1d1071a03e51623f71ae97f28dbd4bb717c5511160467647e" Nov 25 12:40:28 crc kubenswrapper[4693]: I1125 12:40:28.188287 4693 scope.go:117] "RemoveContainer" containerID="ab450c1bca8d4252882da7e80641646f1e20abfff53a33a0fd9b53a68b96f151" Nov 25 12:40:28 crc kubenswrapper[4693]: I1125 12:40:28.235408 4693 scope.go:117] "RemoveContainer" containerID="fdcd0d5de05a34f8c953892764b755589f78335d9849c66857802595412bd4a2" Nov 25 12:40:39 crc kubenswrapper[4693]: I1125 12:40:39.703788 4693 generic.go:334] "Generic (PLEG): container finished" podID="c0c79f30-0e24-4101-8632-19de1642f7e2" containerID="f46149d5dadb5592f9884618205d9a1a92d6092053680aa38af6ad851e2d5200" exitCode=0 Nov 25 12:40:39 crc kubenswrapper[4693]: I1125 12:40:39.703893 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" event={"ID":"c0c79f30-0e24-4101-8632-19de1642f7e2","Type":"ContainerDied","Data":"f46149d5dadb5592f9884618205d9a1a92d6092053680aa38af6ad851e2d5200"} Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.133692 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.224153 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-inventory\") pod \"c0c79f30-0e24-4101-8632-19de1642f7e2\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.224685 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxh7n\" (UniqueName: \"kubernetes.io/projected/c0c79f30-0e24-4101-8632-19de1642f7e2-kube-api-access-fxh7n\") pod \"c0c79f30-0e24-4101-8632-19de1642f7e2\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.224796 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-ssh-key\") pod \"c0c79f30-0e24-4101-8632-19de1642f7e2\" (UID: \"c0c79f30-0e24-4101-8632-19de1642f7e2\") " Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.230907 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c79f30-0e24-4101-8632-19de1642f7e2-kube-api-access-fxh7n" (OuterVolumeSpecName: "kube-api-access-fxh7n") pod "c0c79f30-0e24-4101-8632-19de1642f7e2" (UID: "c0c79f30-0e24-4101-8632-19de1642f7e2"). InnerVolumeSpecName "kube-api-access-fxh7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.257392 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-inventory" (OuterVolumeSpecName: "inventory") pod "c0c79f30-0e24-4101-8632-19de1642f7e2" (UID: "c0c79f30-0e24-4101-8632-19de1642f7e2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.265217 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c0c79f30-0e24-4101-8632-19de1642f7e2" (UID: "c0c79f30-0e24-4101-8632-19de1642f7e2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.325953 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxh7n\" (UniqueName: \"kubernetes.io/projected/c0c79f30-0e24-4101-8632-19de1642f7e2-kube-api-access-fxh7n\") on node \"crc\" DevicePath \"\"" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.326006 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.326017 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c79f30-0e24-4101-8632-19de1642f7e2-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.735909 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" event={"ID":"c0c79f30-0e24-4101-8632-19de1642f7e2","Type":"ContainerDied","Data":"fe552a21f0388734d89837e5083673321cd035b11e9bddddb1d88ce1bfa44491"} Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.735965 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe552a21f0388734d89837e5083673321cd035b11e9bddddb1d88ce1bfa44491" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.736041 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-kvl2g" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.822278 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"] Nov 25 12:40:41 crc kubenswrapper[4693]: E1125 12:40:41.823078 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c79f30-0e24-4101-8632-19de1642f7e2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.823109 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c79f30-0e24-4101-8632-19de1642f7e2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.823366 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c79f30-0e24-4101-8632-19de1642f7e2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.824205 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.824205 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.826893 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.827498 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.828070 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.832000 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.834366 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.834439 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhvs\" (UniqueName: \"kubernetes.io/projected/9667d434-5214-4754-baea-bcc266b58358-kube-api-access-cdhvs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.834629 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.840386 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"]
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.936395 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"
Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.936710 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"
(UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.941096 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.941163 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" Nov 25 12:40:41 crc kubenswrapper[4693]: I1125 12:40:41.953805 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhvs\" (UniqueName: \"kubernetes.io/projected/9667d434-5214-4754-baea-bcc266b58358-kube-api-access-cdhvs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" Nov 25 12:40:42 crc kubenswrapper[4693]: I1125 12:40:42.147241 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" Nov 25 12:40:42 crc kubenswrapper[4693]: I1125 12:40:42.640209 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd"] Nov 25 12:40:42 crc kubenswrapper[4693]: I1125 12:40:42.745557 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" event={"ID":"9667d434-5214-4754-baea-bcc266b58358","Type":"ContainerStarted","Data":"b8e2e16f2f054b4da50f112f009cd3409d4ccc7bedfd874c19abfbc364f23abf"} Nov 25 12:40:43 crc kubenswrapper[4693]: I1125 12:40:43.758449 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" event={"ID":"9667d434-5214-4754-baea-bcc266b58358","Type":"ContainerStarted","Data":"38f953a66906264fb220eadbbf931c899d0d8e38e215e96e75b1aba79820c2a1"} Nov 25 12:40:43 crc kubenswrapper[4693]: I1125 12:40:43.812786 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" podStartSLOduration=2.401624689 podStartE2EDuration="2.812763693s" podCreationTimestamp="2025-11-25 12:40:41 +0000 UTC" firstStartedPulling="2025-11-25 12:40:42.638557783 +0000 UTC m=+1962.556643164" lastFinishedPulling="2025-11-25 12:40:43.049696797 +0000 UTC m=+1962.967782168" observedRunningTime="2025-11-25 12:40:43.799413668 +0000 UTC m=+1963.717499049" watchObservedRunningTime="2025-11-25 12:40:43.812763693 +0000 UTC m=+1963.730849084" Nov 25 12:41:40 crc kubenswrapper[4693]: I1125 12:41:40.350711 4693 generic.go:334] "Generic (PLEG): container finished" podID="9667d434-5214-4754-baea-bcc266b58358" containerID="38f953a66906264fb220eadbbf931c899d0d8e38e215e96e75b1aba79820c2a1" exitCode=0 Nov 25 12:41:40 crc kubenswrapper[4693]: I1125 12:41:40.350834 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" event={"ID":"9667d434-5214-4754-baea-bcc266b58358","Type":"ContainerDied","Data":"38f953a66906264fb220eadbbf931c899d0d8e38e215e96e75b1aba79820c2a1"} Nov 25 12:41:41 crc kubenswrapper[4693]: I1125 12:41:41.787039 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" Nov 25 12:41:41 crc kubenswrapper[4693]: I1125 12:41:41.916632 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-ssh-key\") pod \"9667d434-5214-4754-baea-bcc266b58358\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " Nov 25 12:41:41 crc kubenswrapper[4693]: I1125 12:41:41.916680 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdhvs\" (UniqueName: \"kubernetes.io/projected/9667d434-5214-4754-baea-bcc266b58358-kube-api-access-cdhvs\") pod \"9667d434-5214-4754-baea-bcc266b58358\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " Nov 25 12:41:41 crc kubenswrapper[4693]: I1125 12:41:41.916737 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-inventory\") pod \"9667d434-5214-4754-baea-bcc266b58358\" (UID: \"9667d434-5214-4754-baea-bcc266b58358\") " Nov 25 12:41:41 crc kubenswrapper[4693]: I1125 12:41:41.922862 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9667d434-5214-4754-baea-bcc266b58358-kube-api-access-cdhvs" (OuterVolumeSpecName: "kube-api-access-cdhvs") pod "9667d434-5214-4754-baea-bcc266b58358" (UID: "9667d434-5214-4754-baea-bcc266b58358"). InnerVolumeSpecName "kube-api-access-cdhvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:41:41 crc kubenswrapper[4693]: I1125 12:41:41.954426 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9667d434-5214-4754-baea-bcc266b58358" (UID: "9667d434-5214-4754-baea-bcc266b58358"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:41:41 crc kubenswrapper[4693]: I1125 12:41:41.955727 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-inventory" (OuterVolumeSpecName: "inventory") pod "9667d434-5214-4754-baea-bcc266b58358" (UID: "9667d434-5214-4754-baea-bcc266b58358"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.018503 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.018752 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdhvs\" (UniqueName: \"kubernetes.io/projected/9667d434-5214-4754-baea-bcc266b58358-kube-api-access-cdhvs\") on node \"crc\" DevicePath \"\"" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.018801 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9667d434-5214-4754-baea-bcc266b58358-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.369318 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" event={"ID":"9667d434-5214-4754-baea-bcc266b58358","Type":"ContainerDied","Data":"b8e2e16f2f054b4da50f112f009cd3409d4ccc7bedfd874c19abfbc364f23abf"} Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.369363 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8e2e16f2f054b4da50f112f009cd3409d4ccc7bedfd874c19abfbc364f23abf" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.369432 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.450994 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cx4gc"] Nov 25 12:41:42 crc kubenswrapper[4693]: E1125 12:41:42.451421 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9667d434-5214-4754-baea-bcc266b58358" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.451445 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9667d434-5214-4754-baea-bcc266b58358" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.451597 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9667d434-5214-4754-baea-bcc266b58358" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.452909 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.452909 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.455811 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.455997 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.456009 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.456584 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.474880 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cx4gc"]
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.630500 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgzf8\" (UniqueName: \"kubernetes.io/projected/3200a40a-dfa0-40f7-a79d-054de8e9e386-kube-api-access-rgzf8\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.630605 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.630650 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.731987 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.732095 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgzf8\" (UniqueName: \"kubernetes.io/projected/3200a40a-dfa0-40f7-a79d-054de8e9e386-kube-api-access-rgzf8\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.732160 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.737069 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.737081 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.749875 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgzf8\" (UniqueName: \"kubernetes.io/projected/3200a40a-dfa0-40f7-a79d-054de8e9e386-kube-api-access-rgzf8\") pod \"ssh-known-hosts-edpm-deployment-cx4gc\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:42 crc kubenswrapper[4693]: I1125 12:41:42.775252 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc"
Nov 25 12:41:43 crc kubenswrapper[4693]: I1125 12:41:43.355690 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cx4gc"]
Nov 25 12:41:43 crc kubenswrapper[4693]: I1125 12:41:43.357617 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 25 12:41:43 crc kubenswrapper[4693]: I1125 12:41:43.379345 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc" event={"ID":"3200a40a-dfa0-40f7-a79d-054de8e9e386","Type":"ContainerStarted","Data":"62b6092c925626af458f7bd97146a36e5a070090fc22d716e316e6dbbadef091"}
Nov 25 12:41:44 crc kubenswrapper[4693]: I1125 12:41:44.389455 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc" event={"ID":"3200a40a-dfa0-40f7-a79d-054de8e9e386","Type":"ContainerStarted","Data":"49ebb57b1ed8bbf60a6cc4b08a4b150804d2a0067a4433d8586d9cdf01b242fb"}
Nov 25 12:41:44 crc kubenswrapper[4693]: I1125 12:41:44.412764 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc" podStartSLOduration=1.593300905 podStartE2EDuration="2.41274137s" podCreationTimestamp="2025-11-25 12:41:42 +0000 UTC" firstStartedPulling="2025-11-25 12:41:43.35739631 +0000 UTC m=+2023.275481691" lastFinishedPulling="2025-11-25 12:41:44.176836775 +0000 UTC m=+2024.094922156" observedRunningTime="2025-11-25 12:41:44.402311052 +0000 UTC m=+2024.320396443" watchObservedRunningTime="2025-11-25 12:41:44.41274137 +0000 UTC m=+2024.330826751"
Nov 25 12:41:51 crc kubenswrapper[4693]: I1125 12:41:51.451909 4693 generic.go:334] "Generic (PLEG): container finished" podID="3200a40a-dfa0-40f7-a79d-054de8e9e386" containerID="49ebb57b1ed8bbf60a6cc4b08a4b150804d2a0067a4433d8586d9cdf01b242fb" exitCode=0
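Every job in this chain ends with a "Generic (PLEG): container finished … exitCode=0" entry like the one above; a non-zero exitCode is the first thing to look for when a deployment step fails. A small Go filter that tails a journal stream (for example, piped from journalctl -f) and flags non-zero exits; the regexp assumes the exact key layout seen in these entries:

```go
// Sketch: flag non-zero exit codes in "container finished" PLEG entries.
// Usage idea: journalctl -f | <this program>
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var finished = regexp.MustCompile(`container finished.*containerID="([0-9a-f]+)" exitCode=(\d+)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := finished.FindStringSubmatch(sc.Text()); m != nil && m[2] != "0" {
			fmt.Printf("container %s exited with code %s\n", m[1], m[2])
		}
	}
}
```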
event={"ID":"3200a40a-dfa0-40f7-a79d-054de8e9e386","Type":"ContainerDied","Data":"49ebb57b1ed8bbf60a6cc4b08a4b150804d2a0067a4433d8586d9cdf01b242fb"} Nov 25 12:41:52 crc kubenswrapper[4693]: I1125 12:41:52.891665 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc" Nov 25 12:41:52 crc kubenswrapper[4693]: I1125 12:41:52.937083 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgzf8\" (UniqueName: \"kubernetes.io/projected/3200a40a-dfa0-40f7-a79d-054de8e9e386-kube-api-access-rgzf8\") pod \"3200a40a-dfa0-40f7-a79d-054de8e9e386\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " Nov 25 12:41:52 crc kubenswrapper[4693]: I1125 12:41:52.937157 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-ssh-key-openstack-edpm-ipam\") pod \"3200a40a-dfa0-40f7-a79d-054de8e9e386\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " Nov 25 12:41:52 crc kubenswrapper[4693]: I1125 12:41:52.937222 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-inventory-0\") pod \"3200a40a-dfa0-40f7-a79d-054de8e9e386\" (UID: \"3200a40a-dfa0-40f7-a79d-054de8e9e386\") " Nov 25 12:41:52 crc kubenswrapper[4693]: I1125 12:41:52.945542 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3200a40a-dfa0-40f7-a79d-054de8e9e386-kube-api-access-rgzf8" (OuterVolumeSpecName: "kube-api-access-rgzf8") pod "3200a40a-dfa0-40f7-a79d-054de8e9e386" (UID: "3200a40a-dfa0-40f7-a79d-054de8e9e386"). InnerVolumeSpecName "kube-api-access-rgzf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:41:52 crc kubenswrapper[4693]: I1125 12:41:52.986554 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3200a40a-dfa0-40f7-a79d-054de8e9e386" (UID: "3200a40a-dfa0-40f7-a79d-054de8e9e386"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.016659 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3200a40a-dfa0-40f7-a79d-054de8e9e386" (UID: "3200a40a-dfa0-40f7-a79d-054de8e9e386"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.038606 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.038640 4693 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3200a40a-dfa0-40f7-a79d-054de8e9e386-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.038651 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgzf8\" (UniqueName: \"kubernetes.io/projected/3200a40a-dfa0-40f7-a79d-054de8e9e386-kube-api-access-rgzf8\") on node \"crc\" DevicePath \"\"" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.487223 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc" event={"ID":"3200a40a-dfa0-40f7-a79d-054de8e9e386","Type":"ContainerDied","Data":"62b6092c925626af458f7bd97146a36e5a070090fc22d716e316e6dbbadef091"} Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.487553 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b6092c925626af458f7bd97146a36e5a070090fc22d716e316e6dbbadef091" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.487638 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4gc" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.547750 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"] Nov 25 12:41:53 crc kubenswrapper[4693]: E1125 12:41:53.548229 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3200a40a-dfa0-40f7-a79d-054de8e9e386" containerName="ssh-known-hosts-edpm-deployment" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.548249 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3200a40a-dfa0-40f7-a79d-054de8e9e386" containerName="ssh-known-hosts-edpm-deployment" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.548471 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3200a40a-dfa0-40f7-a79d-054de8e9e386" containerName="ssh-known-hosts-edpm-deployment" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.549087 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.549087 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.550799 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.551064 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.551126 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.551361 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.562118 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.562516 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz2cz\" (UniqueName: \"kubernetes.io/projected/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-kube-api-access-wz2cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.562767 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.567211 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"]
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.666523 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"
Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.666612 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz2cz\" (UniqueName: \"kubernetes.io/projected/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-kube-api-access-wz2cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.671470 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.671880 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.682065 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz2cz\" (UniqueName: \"kubernetes.io/projected/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-kube-api-access-wz2cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vwwwd\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" Nov 25 12:41:53 crc kubenswrapper[4693]: I1125 12:41:53.870555 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" Nov 25 12:41:54 crc kubenswrapper[4693]: I1125 12:41:54.417477 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd"] Nov 25 12:41:54 crc kubenswrapper[4693]: I1125 12:41:54.498239 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" event={"ID":"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46","Type":"ContainerStarted","Data":"ba91c84ec358ca8a75331f1d6913982a1910aae7f577ba9b26464324b4d10399"} Nov 25 12:41:55 crc kubenswrapper[4693]: I1125 12:41:55.509467 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" event={"ID":"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46","Type":"ContainerStarted","Data":"5d9b3adde468935efbc415c929158810f043b5c9c998defc521478ceb37cd19c"} Nov 25 12:42:04 crc kubenswrapper[4693]: I1125 12:42:04.593518 4693 generic.go:334] "Generic (PLEG): container finished" podID="a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46" containerID="5d9b3adde468935efbc415c929158810f043b5c9c998defc521478ceb37cd19c" exitCode=0 Nov 25 12:42:04 crc kubenswrapper[4693]: I1125 12:42:04.594119 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" event={"ID":"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46","Type":"ContainerDied","Data":"5d9b3adde468935efbc415c929158810f043b5c9c998defc521478ceb37cd19c"} Nov 25 12:42:05 crc kubenswrapper[4693]: I1125 12:42:05.113652 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:42:05 crc kubenswrapper[4693]: I1125 12:42:05.114214 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.008788 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.101492 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz2cz\" (UniqueName: \"kubernetes.io/projected/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-kube-api-access-wz2cz\") pod \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.101627 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-ssh-key\") pod \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.101722 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-inventory\") pod \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\" (UID: \"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46\") " Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.107057 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-kube-api-access-wz2cz" (OuterVolumeSpecName: "kube-api-access-wz2cz") pod "a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46" (UID: "a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46"). InnerVolumeSpecName "kube-api-access-wz2cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.132002 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46" (UID: "a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.133277 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-inventory" (OuterVolumeSpecName: "inventory") pod "a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46" (UID: "a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.204281 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz2cz\" (UniqueName: \"kubernetes.io/projected/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-kube-api-access-wz2cz\") on node \"crc\" DevicePath \"\"" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.204312 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.204321 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.612935 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" event={"ID":"a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46","Type":"ContainerDied","Data":"ba91c84ec358ca8a75331f1d6913982a1910aae7f577ba9b26464324b4d10399"} Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.613205 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba91c84ec358ca8a75331f1d6913982a1910aae7f577ba9b26464324b4d10399" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.612998 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vwwwd" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.716025 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"] Nov 25 12:42:06 crc kubenswrapper[4693]: E1125 12:42:06.716628 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.716642 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.717038 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.717870 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.717870 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.728636 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"]
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.749232 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.749471 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.751844 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.752997 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.818086 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.818231 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.818284 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kncdp\" (UniqueName: \"kubernetes.io/projected/622af4c3-4b56-4b3c-8ea2-6d30432a706a-kube-api-access-kncdp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.920137 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"
Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.920205 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"
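Each kube-api-access-* volume in these entries (kube-api-access-kncdp here) is a projected volume combining the pod's service-account token, the cluster CA certificate, and the namespace name; inside the container they surface at the standard service-account path. A sketch of what a process in the pod would read:

```go
// Sketch: the files a kube-api-access-* projected volume exposes in-pod.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(filepath.Join(dir, name))
		if err != nil {
			fmt.Println(name, "not available:", err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(b))
	}
}
```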
\"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.924667 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.925772 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" Nov 25 12:42:06 crc kubenswrapper[4693]: I1125 12:42:06.938473 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kncdp\" (UniqueName: \"kubernetes.io/projected/622af4c3-4b56-4b3c-8ea2-6d30432a706a-kube-api-access-kncdp\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" Nov 25 12:42:07 crc kubenswrapper[4693]: I1125 12:42:07.070944 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" Nov 25 12:42:07 crc kubenswrapper[4693]: I1125 12:42:07.575633 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7"] Nov 25 12:42:07 crc kubenswrapper[4693]: I1125 12:42:07.625871 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" event={"ID":"622af4c3-4b56-4b3c-8ea2-6d30432a706a","Type":"ContainerStarted","Data":"04d5394a2145b4ffba56f5aaef9eb65e9a948fc432a5b39b66ce27523428ca34"} Nov 25 12:42:08 crc kubenswrapper[4693]: I1125 12:42:08.636508 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" event={"ID":"622af4c3-4b56-4b3c-8ea2-6d30432a706a","Type":"ContainerStarted","Data":"c98e0fac2175dca8775f3419bebe183554f1f406b29a4a74f8cbc61384693a3c"} Nov 25 12:42:08 crc kubenswrapper[4693]: I1125 12:42:08.659735 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" podStartSLOduration=2.198481759 podStartE2EDuration="2.659715906s" podCreationTimestamp="2025-11-25 12:42:06 +0000 UTC" firstStartedPulling="2025-11-25 12:42:07.579699101 +0000 UTC m=+2047.497784482" lastFinishedPulling="2025-11-25 12:42:08.040933238 +0000 UTC m=+2047.959018629" observedRunningTime="2025-11-25 12:42:08.649899725 +0000 UTC m=+2048.567985126" watchObservedRunningTime="2025-11-25 12:42:08.659715906 +0000 UTC m=+2048.577801287" Nov 25 12:42:18 crc kubenswrapper[4693]: I1125 12:42:18.731791 4693 generic.go:334] "Generic (PLEG): container finished" podID="622af4c3-4b56-4b3c-8ea2-6d30432a706a" containerID="c98e0fac2175dca8775f3419bebe183554f1f406b29a4a74f8cbc61384693a3c" exitCode=0 Nov 25 12:42:18 crc kubenswrapper[4693]: I1125 12:42:18.731838 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" 
event={"ID":"622af4c3-4b56-4b3c-8ea2-6d30432a706a","Type":"ContainerDied","Data":"c98e0fac2175dca8775f3419bebe183554f1f406b29a4a74f8cbc61384693a3c"} Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.179660 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.291188 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-inventory\") pod \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.291535 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-ssh-key\") pod \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.291619 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kncdp\" (UniqueName: \"kubernetes.io/projected/622af4c3-4b56-4b3c-8ea2-6d30432a706a-kube-api-access-kncdp\") pod \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\" (UID: \"622af4c3-4b56-4b3c-8ea2-6d30432a706a\") " Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.296914 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622af4c3-4b56-4b3c-8ea2-6d30432a706a-kube-api-access-kncdp" (OuterVolumeSpecName: "kube-api-access-kncdp") pod "622af4c3-4b56-4b3c-8ea2-6d30432a706a" (UID: "622af4c3-4b56-4b3c-8ea2-6d30432a706a"). InnerVolumeSpecName "kube-api-access-kncdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.324535 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-inventory" (OuterVolumeSpecName: "inventory") pod "622af4c3-4b56-4b3c-8ea2-6d30432a706a" (UID: "622af4c3-4b56-4b3c-8ea2-6d30432a706a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.325087 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "622af4c3-4b56-4b3c-8ea2-6d30432a706a" (UID: "622af4c3-4b56-4b3c-8ea2-6d30432a706a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.394874 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.394912 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/622af4c3-4b56-4b3c-8ea2-6d30432a706a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.394929 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kncdp\" (UniqueName: \"kubernetes.io/projected/622af4c3-4b56-4b3c-8ea2-6d30432a706a-kube-api-access-kncdp\") on node \"crc\" DevicePath \"\"" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.753057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" event={"ID":"622af4c3-4b56-4b3c-8ea2-6d30432a706a","Type":"ContainerDied","Data":"04d5394a2145b4ffba56f5aaef9eb65e9a948fc432a5b39b66ce27523428ca34"} Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.753094 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04d5394a2145b4ffba56f5aaef9eb65e9a948fc432a5b39b66ce27523428ca34" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.753145 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.869119 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr"] Nov 25 12:42:20 crc kubenswrapper[4693]: E1125 12:42:20.869678 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622af4c3-4b56-4b3c-8ea2-6d30432a706a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.869712 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="622af4c3-4b56-4b3c-8ea2-6d30432a706a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.870018 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="622af4c3-4b56-4b3c-8ea2-6d30432a706a" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.870857 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.870857 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.874165 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.876050 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.876353 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.876437 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.876792 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.877115 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.878195 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.879915 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr"]
Nov 25 12:42:20 crc kubenswrapper[4693]: I1125 12:42:20.916446 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.017856 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr"
Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018254 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr"
Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018281 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr"
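The install-certs job attaches one *-combined-ca-bundle secret per service (ovn and bootstrap above; libvirt, telemetry, nova, neutron-metadata, and repo-setup follow) alongside the per-service default-certs projected volumes. Each bundle is ordinary PEM, so loading one into a certificate pool is plain x509 work; the local file name in this sketch is hypothetical:

```go
// Sketch: load a combined CA bundle (PEM) into a cert pool. The file name
// is hypothetical; in the pod the bundles arrive via the secret mounts above.
package main

import (
	"crypto/x509"
	"fmt"
	"os"
)

func main() {
	pem, err := os.ReadFile("combined-ca-bundle.pem")
	if err != nil {
		panic(err)
	}
	pool := x509.NewCertPool()
	if !pool.AppendCertsFromPEM(pem) {
		panic("no CA certificates found in bundle")
	}
	fmt.Println("CA bundle loaded into pool")
}
```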
\"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018330 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018393 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018414 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzsv7\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-kube-api-access-zzsv7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018444 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018507 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018527 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018595 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: 
\"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018621 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018654 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.018673 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.119835 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.119926 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.119949 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.119988 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120042 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120061 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120079 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120105 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120135 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120152 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzsv7\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-kube-api-access-zzsv7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120179 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120210 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120229 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.120260 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.125696 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.125975 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.126745 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.126985 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.127069 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.128415 4693 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.129655 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.129759 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.130637 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.131355 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.132792 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.133269 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.139164 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.151623 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzsv7\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-kube-api-access-zzsv7\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.237138 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:42:21 crc kubenswrapper[4693]: I1125 12:42:21.776412 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr"] Nov 25 12:42:22 crc kubenswrapper[4693]: I1125 12:42:22.788767 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" event={"ID":"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f","Type":"ContainerStarted","Data":"577a80b047fddbae29c4711ebd06e245eb240a4f21d5b4698052740b9f3200b5"} Nov 25 12:42:22 crc kubenswrapper[4693]: I1125 12:42:22.790139 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" event={"ID":"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f","Type":"ContainerStarted","Data":"6c7c556b94b6ab0d5147b08f66276889d144479a879c4d3270c24a3287873ce3"} Nov 25 12:42:22 crc kubenswrapper[4693]: I1125 12:42:22.815028 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" podStartSLOduration=2.415097276 podStartE2EDuration="2.815012953s" podCreationTimestamp="2025-11-25 12:42:20 +0000 UTC" firstStartedPulling="2025-11-25 12:42:21.785214443 +0000 UTC m=+2061.703299824" lastFinishedPulling="2025-11-25 12:42:22.18513012 +0000 UTC m=+2062.103215501" observedRunningTime="2025-11-25 12:42:22.805184322 +0000 UTC m=+2062.723269703" watchObservedRunningTime="2025-11-25 12:42:22.815012953 +0000 UTC m=+2062.733098334" Nov 25 12:42:35 crc kubenswrapper[4693]: I1125 12:42:35.114076 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:42:35 crc kubenswrapper[4693]: I1125 12:42:35.114723 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:43:05 crc kubenswrapper[4693]: I1125 12:43:05.113793 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:43:05 crc kubenswrapper[4693]: I1125 12:43:05.114337 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" 
podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:43:05 crc kubenswrapper[4693]: I1125 12:43:05.114491 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:43:05 crc kubenswrapper[4693]: I1125 12:43:05.115315 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be2aba682d9474189f085318cf98009d0155e53628b3d276f5e0ba4c49edb9d9"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:43:05 crc kubenswrapper[4693]: I1125 12:43:05.115410 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://be2aba682d9474189f085318cf98009d0155e53628b3d276f5e0ba4c49edb9d9" gracePeriod=600 Nov 25 12:43:05 crc kubenswrapper[4693]: I1125 12:43:05.199002 4693 generic.go:334] "Generic (PLEG): container finished" podID="6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" containerID="577a80b047fddbae29c4711ebd06e245eb240a4f21d5b4698052740b9f3200b5" exitCode=0 Nov 25 12:43:05 crc kubenswrapper[4693]: I1125 12:43:05.199053 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" event={"ID":"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f","Type":"ContainerDied","Data":"577a80b047fddbae29c4711ebd06e245eb240a4f21d5b4698052740b9f3200b5"} Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.230479 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="be2aba682d9474189f085318cf98009d0155e53628b3d276f5e0ba4c49edb9d9" exitCode=0 Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.230598 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"be2aba682d9474189f085318cf98009d0155e53628b3d276f5e0ba4c49edb9d9"} Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.231367 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0"} Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.231482 4693 scope.go:117] "RemoveContainer" containerID="a18898526fdc5e0642b61f73d6efa795022ac19271da9dfd49faf8062cda8349" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.621262 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697065 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-repo-setup-combined-ca-bundle\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697308 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzsv7\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-kube-api-access-zzsv7\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697361 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ovn-combined-ca-bundle\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697446 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697493 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-ovn-default-certs-0\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697547 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-telemetry-combined-ca-bundle\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697581 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-libvirt-combined-ca-bundle\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697619 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-inventory\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697671 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-nova-combined-ca-bundle\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 
12:43:06.697708 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-bootstrap-combined-ca-bundle\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697828 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697884 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.697962 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-neutron-metadata-combined-ca-bundle\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.698002 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ssh-key\") pod \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\" (UID: \"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f\") " Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.705413 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.706237 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.706336 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-kube-api-access-zzsv7" (OuterVolumeSpecName: "kube-api-access-zzsv7") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "kube-api-access-zzsv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.706675 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.706749 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.706839 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.707216 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.707302 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.707558 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.708466 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.708877 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.709302 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.730835 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-inventory" (OuterVolumeSpecName: "inventory") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.748180 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" (UID: "6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800620 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzsv7\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-kube-api-access-zzsv7\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800644 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800655 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800665 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800674 4693 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800682 4693 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800691 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800698 4693 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800705 4693 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800714 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800724 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800735 4693 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800744 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:06 crc kubenswrapper[4693]: I1125 12:43:06.800755 4693 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.241887 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.241881 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr" event={"ID":"6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f","Type":"ContainerDied","Data":"6c7c556b94b6ab0d5147b08f66276889d144479a879c4d3270c24a3287873ce3"} Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.242019 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7c556b94b6ab0d5147b08f66276889d144479a879c4d3270c24a3287873ce3" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.328762 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k"] Nov 25 12:43:07 crc kubenswrapper[4693]: E1125 12:43:07.329206 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.329225 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.329426 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.330119 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.334350 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.334673 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.334869 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.335034 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.335223 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.341745 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k"] Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.410585 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.410655 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.410717 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.410787 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtbf\" (UniqueName: \"kubernetes.io/projected/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-kube-api-access-pmtbf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.410854 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.512927 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" 
(UniqueName: \"kubernetes.io/configmap/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.512976 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.513043 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.513087 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtbf\" (UniqueName: \"kubernetes.io/projected/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-kube-api-access-pmtbf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.513124 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.514528 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.517255 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.522271 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.522758 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.532213 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtbf\" (UniqueName: \"kubernetes.io/projected/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-kube-api-access-pmtbf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7ct5k\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:07 crc kubenswrapper[4693]: I1125 12:43:07.658100 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:43:08 crc kubenswrapper[4693]: I1125 12:43:08.196221 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k"] Nov 25 12:43:08 crc kubenswrapper[4693]: W1125 12:43:08.201329 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7282171_b6bf_44b4_a5a3_f60d6d5baa5f.slice/crio-ecf095d7c9516470eb704cc1d5854ccf9aaebc359bffb3ea1ef6061289c54036 WatchSource:0}: Error finding container ecf095d7c9516470eb704cc1d5854ccf9aaebc359bffb3ea1ef6061289c54036: Status 404 returned error can't find the container with id ecf095d7c9516470eb704cc1d5854ccf9aaebc359bffb3ea1ef6061289c54036 Nov 25 12:43:08 crc kubenswrapper[4693]: I1125 12:43:08.255170 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" event={"ID":"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f","Type":"ContainerStarted","Data":"ecf095d7c9516470eb704cc1d5854ccf9aaebc359bffb3ea1ef6061289c54036"} Nov 25 12:43:09 crc kubenswrapper[4693]: I1125 12:43:09.264209 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" event={"ID":"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f","Type":"ContainerStarted","Data":"f0e4be21512b246b6f26f2d6994e0e30104a4476b6405bd20f1c06675bf18663"} Nov 25 12:43:09 crc kubenswrapper[4693]: I1125 12:43:09.283106 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" podStartSLOduration=1.715155572 podStartE2EDuration="2.283086487s" podCreationTimestamp="2025-11-25 12:43:07 +0000 UTC" firstStartedPulling="2025-11-25 12:43:08.203308498 +0000 UTC m=+2108.121393879" lastFinishedPulling="2025-11-25 12:43:08.771239403 +0000 UTC m=+2108.689324794" observedRunningTime="2025-11-25 12:43:09.27980546 +0000 UTC m=+2109.197890861" watchObservedRunningTime="2025-11-25 12:43:09.283086487 +0000 UTC m=+2109.201171868" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.149922 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l2hdl"] Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.153399 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.181390 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2hdl"] Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.275358 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-catalog-content\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.275773 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6qxn\" (UniqueName: \"kubernetes.io/projected/5c8e1342-1319-4c47-9a60-a287e55cff29-kube-api-access-t6qxn\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.276541 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-utilities\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.378854 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-catalog-content\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.379156 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6qxn\" (UniqueName: \"kubernetes.io/projected/5c8e1342-1319-4c47-9a60-a287e55cff29-kube-api-access-t6qxn\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.379242 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-utilities\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.379797 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-catalog-content\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.379809 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-utilities\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.399208 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t6qxn\" (UniqueName: \"kubernetes.io/projected/5c8e1342-1319-4c47-9a60-a287e55cff29-kube-api-access-t6qxn\") pod \"redhat-marketplace-l2hdl\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.481124 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:33 crc kubenswrapper[4693]: I1125 12:43:33.982721 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2hdl"] Nov 25 12:43:34 crc kubenswrapper[4693]: I1125 12:43:34.498561 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerID="c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241" exitCode=0 Nov 25 12:43:34 crc kubenswrapper[4693]: I1125 12:43:34.498629 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2hdl" event={"ID":"5c8e1342-1319-4c47-9a60-a287e55cff29","Type":"ContainerDied","Data":"c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241"} Nov 25 12:43:34 crc kubenswrapper[4693]: I1125 12:43:34.498835 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2hdl" event={"ID":"5c8e1342-1319-4c47-9a60-a287e55cff29","Type":"ContainerStarted","Data":"9334d18895bebb52e7217e30f07368c2591987b16053185bf7ac192cbc8ef612"} Nov 25 12:43:35 crc kubenswrapper[4693]: I1125 12:43:35.509845 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerID="70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2" exitCode=0 Nov 25 12:43:35 crc kubenswrapper[4693]: I1125 12:43:35.509896 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2hdl" event={"ID":"5c8e1342-1319-4c47-9a60-a287e55cff29","Type":"ContainerDied","Data":"70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2"} Nov 25 12:43:36 crc kubenswrapper[4693]: I1125 12:43:36.522561 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2hdl" event={"ID":"5c8e1342-1319-4c47-9a60-a287e55cff29","Type":"ContainerStarted","Data":"7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee"} Nov 25 12:43:36 crc kubenswrapper[4693]: I1125 12:43:36.543439 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l2hdl" podStartSLOduration=2.14620063 podStartE2EDuration="3.543418063s" podCreationTimestamp="2025-11-25 12:43:33 +0000 UTC" firstStartedPulling="2025-11-25 12:43:34.500884326 +0000 UTC m=+2134.418969707" lastFinishedPulling="2025-11-25 12:43:35.898101759 +0000 UTC m=+2135.816187140" observedRunningTime="2025-11-25 12:43:36.542288573 +0000 UTC m=+2136.460373964" watchObservedRunningTime="2025-11-25 12:43:36.543418063 +0000 UTC m=+2136.461503464" Nov 25 12:43:43 crc kubenswrapper[4693]: I1125 12:43:43.481957 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:43 crc kubenswrapper[4693]: I1125 12:43:43.482548 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:43 crc kubenswrapper[4693]: I1125 12:43:43.525432 4693 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:43 crc kubenswrapper[4693]: I1125 12:43:43.620937 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:43 crc kubenswrapper[4693]: I1125 12:43:43.756843 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2hdl"] Nov 25 12:43:45 crc kubenswrapper[4693]: I1125 12:43:45.593603 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l2hdl" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="registry-server" containerID="cri-o://7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee" gracePeriod=2 Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.041866 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.167188 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r696k"] Nov 25 12:43:46 crc kubenswrapper[4693]: E1125 12:43:46.167609 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="extract-content" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.167623 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="extract-content" Nov 25 12:43:46 crc kubenswrapper[4693]: E1125 12:43:46.167638 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="registry-server" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.167645 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="registry-server" Nov 25 12:43:46 crc kubenswrapper[4693]: E1125 12:43:46.167660 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="extract-utilities" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.167667 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="extract-utilities" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.167904 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerName="registry-server" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.169178 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.186897 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r696k"] Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.239226 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6qxn\" (UniqueName: \"kubernetes.io/projected/5c8e1342-1319-4c47-9a60-a287e55cff29-kube-api-access-t6qxn\") pod \"5c8e1342-1319-4c47-9a60-a287e55cff29\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.239407 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-utilities\") pod \"5c8e1342-1319-4c47-9a60-a287e55cff29\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.239452 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-catalog-content\") pod \"5c8e1342-1319-4c47-9a60-a287e55cff29\" (UID: \"5c8e1342-1319-4c47-9a60-a287e55cff29\") " Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.240481 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-utilities" (OuterVolumeSpecName: "utilities") pod "5c8e1342-1319-4c47-9a60-a287e55cff29" (UID: "5c8e1342-1319-4c47-9a60-a287e55cff29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.253579 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8e1342-1319-4c47-9a60-a287e55cff29-kube-api-access-t6qxn" (OuterVolumeSpecName: "kube-api-access-t6qxn") pod "5c8e1342-1319-4c47-9a60-a287e55cff29" (UID: "5c8e1342-1319-4c47-9a60-a287e55cff29"). InnerVolumeSpecName "kube-api-access-t6qxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.257692 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c8e1342-1319-4c47-9a60-a287e55cff29" (UID: "5c8e1342-1319-4c47-9a60-a287e55cff29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.341944 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-catalog-content\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.342034 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-utilities\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.342061 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57q2c\" (UniqueName: \"kubernetes.io/projected/5c6134f7-536e-4aa4-9dc5-bf49e374802e-kube-api-access-57q2c\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.342234 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6qxn\" (UniqueName: \"kubernetes.io/projected/5c8e1342-1319-4c47-9a60-a287e55cff29-kube-api-access-t6qxn\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.342247 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.342258 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8e1342-1319-4c47-9a60-a287e55cff29-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.443605 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-catalog-content\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.443968 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-utilities\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.444003 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57q2c\" (UniqueName: \"kubernetes.io/projected/5c6134f7-536e-4aa4-9dc5-bf49e374802e-kube-api-access-57q2c\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.444101 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-catalog-content\") pod \"redhat-operators-r696k\" 
(UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.444683 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-utilities\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.468759 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57q2c\" (UniqueName: \"kubernetes.io/projected/5c6134f7-536e-4aa4-9dc5-bf49e374802e-kube-api-access-57q2c\") pod \"redhat-operators-r696k\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.485124 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.645744 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c8e1342-1319-4c47-9a60-a287e55cff29" containerID="7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee" exitCode=0 Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.645815 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2hdl" event={"ID":"5c8e1342-1319-4c47-9a60-a287e55cff29","Type":"ContainerDied","Data":"7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee"} Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.645867 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2hdl" event={"ID":"5c8e1342-1319-4c47-9a60-a287e55cff29","Type":"ContainerDied","Data":"9334d18895bebb52e7217e30f07368c2591987b16053185bf7ac192cbc8ef612"} Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.645896 4693 scope.go:117] "RemoveContainer" containerID="7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.645895 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2hdl" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.673988 4693 scope.go:117] "RemoveContainer" containerID="70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.715257 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2hdl"] Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.718493 4693 scope.go:117] "RemoveContainer" containerID="c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.732629 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2hdl"] Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.774412 4693 scope.go:117] "RemoveContainer" containerID="7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee" Nov 25 12:43:46 crc kubenswrapper[4693]: E1125 12:43:46.775838 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee\": container with ID starting with 7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee not found: ID does not exist" containerID="7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.775894 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee"} err="failed to get container status \"7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee\": rpc error: code = NotFound desc = could not find container \"7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee\": container with ID starting with 7362a95d9c67a7faf0e5db2558c861c2673d7c07cfd374ab018278cb11145bee not found: ID does not exist" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.775923 4693 scope.go:117] "RemoveContainer" containerID="70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2" Nov 25 12:43:46 crc kubenswrapper[4693]: E1125 12:43:46.777571 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2\": container with ID starting with 70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2 not found: ID does not exist" containerID="70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.777628 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2"} err="failed to get container status \"70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2\": rpc error: code = NotFound desc = could not find container \"70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2\": container with ID starting with 70561a7402b27252f85703eb8c65f53c206322b487491888cf8cc71e7fc0a6d2 not found: ID does not exist" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.777663 4693 scope.go:117] "RemoveContainer" containerID="c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241" Nov 25 12:43:46 crc kubenswrapper[4693]: E1125 12:43:46.777982 4693 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241\": container with ID starting with c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241 not found: ID does not exist" containerID="c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.778013 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241"} err="failed to get container status \"c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241\": rpc error: code = NotFound desc = could not find container \"c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241\": container with ID starting with c0924b5a65be1742eccf94c2148d07483c220f3648b0ea6a8ea0f167171a2241 not found: ID does not exist" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.825683 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8e1342-1319-4c47-9a60-a287e55cff29" path="/var/lib/kubelet/pods/5c8e1342-1319-4c47-9a60-a287e55cff29/volumes" Nov 25 12:43:46 crc kubenswrapper[4693]: I1125 12:43:46.994690 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r696k"] Nov 25 12:43:47 crc kubenswrapper[4693]: I1125 12:43:47.657624 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerID="700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af" exitCode=0 Nov 25 12:43:47 crc kubenswrapper[4693]: I1125 12:43:47.657684 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r696k" event={"ID":"5c6134f7-536e-4aa4-9dc5-bf49e374802e","Type":"ContainerDied","Data":"700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af"} Nov 25 12:43:47 crc kubenswrapper[4693]: I1125 12:43:47.657709 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r696k" event={"ID":"5c6134f7-536e-4aa4-9dc5-bf49e374802e","Type":"ContainerStarted","Data":"8f40ddb43a4a3094176eb2b77bcb4ebb1ecb64b192ce57789f957130903550fe"} Nov 25 12:43:49 crc kubenswrapper[4693]: I1125 12:43:49.683525 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerID="4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383" exitCode=0 Nov 25 12:43:49 crc kubenswrapper[4693]: I1125 12:43:49.683596 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r696k" event={"ID":"5c6134f7-536e-4aa4-9dc5-bf49e374802e","Type":"ContainerDied","Data":"4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383"} Nov 25 12:43:51 crc kubenswrapper[4693]: I1125 12:43:51.706210 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r696k" event={"ID":"5c6134f7-536e-4aa4-9dc5-bf49e374802e","Type":"ContainerStarted","Data":"7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9"} Nov 25 12:43:51 crc kubenswrapper[4693]: I1125 12:43:51.732451 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r696k" podStartSLOduration=2.810577678 podStartE2EDuration="5.732427762s" podCreationTimestamp="2025-11-25 12:43:46 +0000 UTC" firstStartedPulling="2025-11-25 12:43:47.65980481 +0000 UTC m=+2147.577890191" 
lastFinishedPulling="2025-11-25 12:43:50.581654884 +0000 UTC m=+2150.499740275" observedRunningTime="2025-11-25 12:43:51.722502988 +0000 UTC m=+2151.640588379" watchObservedRunningTime="2025-11-25 12:43:51.732427762 +0000 UTC m=+2151.650513163" Nov 25 12:43:56 crc kubenswrapper[4693]: I1125 12:43:56.485422 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:56 crc kubenswrapper[4693]: I1125 12:43:56.486494 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:56 crc kubenswrapper[4693]: I1125 12:43:56.541590 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:56 crc kubenswrapper[4693]: I1125 12:43:56.804042 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:56 crc kubenswrapper[4693]: I1125 12:43:56.891808 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r696k"] Nov 25 12:43:58 crc kubenswrapper[4693]: I1125 12:43:58.767516 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r696k" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="registry-server" containerID="cri-o://7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9" gracePeriod=2 Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.270348 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.421366 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57q2c\" (UniqueName: \"kubernetes.io/projected/5c6134f7-536e-4aa4-9dc5-bf49e374802e-kube-api-access-57q2c\") pod \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.421800 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-catalog-content\") pod \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.422066 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-utilities\") pod \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\" (UID: \"5c6134f7-536e-4aa4-9dc5-bf49e374802e\") " Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.422812 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-utilities" (OuterVolumeSpecName: "utilities") pod "5c6134f7-536e-4aa4-9dc5-bf49e374802e" (UID: "5c6134f7-536e-4aa4-9dc5-bf49e374802e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.427137 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6134f7-536e-4aa4-9dc5-bf49e374802e-kube-api-access-57q2c" (OuterVolumeSpecName: "kube-api-access-57q2c") pod "5c6134f7-536e-4aa4-9dc5-bf49e374802e" (UID: "5c6134f7-536e-4aa4-9dc5-bf49e374802e"). InnerVolumeSpecName "kube-api-access-57q2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.514464 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c6134f7-536e-4aa4-9dc5-bf49e374802e" (UID: "5c6134f7-536e-4aa4-9dc5-bf49e374802e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.524917 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57q2c\" (UniqueName: \"kubernetes.io/projected/5c6134f7-536e-4aa4-9dc5-bf49e374802e-kube-api-access-57q2c\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.524957 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.524970 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c6134f7-536e-4aa4-9dc5-bf49e374802e-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.777786 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerID="7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9" exitCode=0 Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.777828 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r696k" event={"ID":"5c6134f7-536e-4aa4-9dc5-bf49e374802e","Type":"ContainerDied","Data":"7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9"} Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.777860 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r696k" event={"ID":"5c6134f7-536e-4aa4-9dc5-bf49e374802e","Type":"ContainerDied","Data":"8f40ddb43a4a3094176eb2b77bcb4ebb1ecb64b192ce57789f957130903550fe"} Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.777861 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r696k" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.777879 4693 scope.go:117] "RemoveContainer" containerID="7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.802638 4693 scope.go:117] "RemoveContainer" containerID="4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.818656 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r696k"] Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.831238 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r696k"] Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.833201 4693 scope.go:117] "RemoveContainer" containerID="700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.872312 4693 scope.go:117] "RemoveContainer" containerID="7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9" Nov 25 12:43:59 crc kubenswrapper[4693]: E1125 12:43:59.872767 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9\": container with ID starting with 7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9 not found: ID does not exist" containerID="7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.872805 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9"} err="failed to get container status \"7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9\": rpc error: code = NotFound desc = could not find container \"7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9\": container with ID starting with 7994d88100913ccaab16bf798c372a26476107d4a1858fc91d5ba780d7f619d9 not found: ID does not exist" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.872825 4693 scope.go:117] "RemoveContainer" containerID="4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383" Nov 25 12:43:59 crc kubenswrapper[4693]: E1125 12:43:59.873193 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383\": container with ID starting with 4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383 not found: ID does not exist" containerID="4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.873224 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383"} err="failed to get container status \"4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383\": rpc error: code = NotFound desc = could not find container \"4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383\": container with ID starting with 4a3bb503fda221a75eeba5f70410eb3baf2dd553a3f0b0363e5097753cfa1383 not found: ID does not exist" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.873241 4693 scope.go:117] "RemoveContainer" 
containerID="700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af" Nov 25 12:43:59 crc kubenswrapper[4693]: E1125 12:43:59.873608 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af\": container with ID starting with 700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af not found: ID does not exist" containerID="700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af" Nov 25 12:43:59 crc kubenswrapper[4693]: I1125 12:43:59.873663 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af"} err="failed to get container status \"700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af\": rpc error: code = NotFound desc = could not find container \"700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af\": container with ID starting with 700bf0f1b6243879c15a4a6bf868a5bd14cde284a4c7a7da04981b8b505931af not found: ID does not exist" Nov 25 12:44:00 crc kubenswrapper[4693]: I1125 12:44:00.823404 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" path="/var/lib/kubelet/pods/5c6134f7-536e-4aa4-9dc5-bf49e374802e/volumes" Nov 25 12:44:18 crc kubenswrapper[4693]: I1125 12:44:18.967550 4693 generic.go:334] "Generic (PLEG): container finished" podID="d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" containerID="f0e4be21512b246b6f26f2d6994e0e30104a4476b6405bd20f1c06675bf18663" exitCode=0 Nov 25 12:44:18 crc kubenswrapper[4693]: I1125 12:44:18.967640 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" event={"ID":"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f","Type":"ContainerDied","Data":"f0e4be21512b246b6f26f2d6994e0e30104a4476b6405bd20f1c06675bf18663"} Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.447463 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.627293 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-inventory\") pod \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.628073 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovn-combined-ca-bundle\") pod \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.628130 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovncontroller-config-0\") pod \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.628262 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ssh-key\") pod \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.628283 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtbf\" (UniqueName: \"kubernetes.io/projected/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-kube-api-access-pmtbf\") pod \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\" (UID: \"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f\") " Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.635581 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" (UID: "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.635851 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-kube-api-access-pmtbf" (OuterVolumeSpecName: "kube-api-access-pmtbf") pod "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" (UID: "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f"). InnerVolumeSpecName "kube-api-access-pmtbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.652478 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" (UID: "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.658835 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" (UID: "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.663052 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-inventory" (OuterVolumeSpecName: "inventory") pod "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" (UID: "d7282171-b6bf-44b4-a5a3-f60d6d5baa5f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.730732 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.730777 4693 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.730793 4693 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.730805 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:44:20 crc kubenswrapper[4693]: I1125 12:44:20.730816 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtbf\" (UniqueName: \"kubernetes.io/projected/d7282171-b6bf-44b4-a5a3-f60d6d5baa5f-kube-api-access-pmtbf\") on node \"crc\" DevicePath \"\"" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.017623 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" event={"ID":"d7282171-b6bf-44b4-a5a3-f60d6d5baa5f","Type":"ContainerDied","Data":"ecf095d7c9516470eb704cc1d5854ccf9aaebc359bffb3ea1ef6061289c54036"} Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.017956 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecf095d7c9516470eb704cc1d5854ccf9aaebc359bffb3ea1ef6061289c54036" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.017886 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7ct5k" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.080058 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz"] Nov 25 12:44:21 crc kubenswrapper[4693]: E1125 12:44:21.080601 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="extract-content" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.080624 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="extract-content" Nov 25 12:44:21 crc kubenswrapper[4693]: E1125 12:44:21.080638 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="registry-server" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.080646 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="registry-server" Nov 25 12:44:21 crc kubenswrapper[4693]: E1125 12:44:21.080678 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.080686 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 12:44:21 crc kubenswrapper[4693]: E1125 12:44:21.080704 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="extract-utilities" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.080710 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="extract-utilities" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.080974 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7282171-b6bf-44b4-a5a3-f60d6d5baa5f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.081007 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6134f7-536e-4aa4-9dc5-bf49e374802e" containerName="registry-server" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.081876 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.085469 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.085516 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.085573 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.085783 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.085867 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.085947 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.099399 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz"] Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.241389 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.241460 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.241679 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.241903 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpgnw\" (UniqueName: \"kubernetes.io/projected/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-kube-api-access-kpgnw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.241947 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.242038 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.343441 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.343607 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.343773 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.344286 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.344411 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpgnw\" (UniqueName: \"kubernetes.io/projected/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-kube-api-access-kpgnw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.344448 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 
12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.350072 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.350232 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.350781 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.351398 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.351613 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.367478 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpgnw\" (UniqueName: \"kubernetes.io/projected/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-kube-api-access-kpgnw\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.399835 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:44:21 crc kubenswrapper[4693]: I1125 12:44:21.955597 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz"] Nov 25 12:44:22 crc kubenswrapper[4693]: I1125 12:44:22.029696 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" event={"ID":"3e1334ad-6a95-4d72-95a2-5dfa8d78e530","Type":"ContainerStarted","Data":"00c1edaace01e41771f36dfaf615f787d074f39cdf2497dfc164c9cb62a35bfa"} Nov 25 12:44:23 crc kubenswrapper[4693]: I1125 12:44:23.040993 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" event={"ID":"3e1334ad-6a95-4d72-95a2-5dfa8d78e530","Type":"ContainerStarted","Data":"73334df6cbea10f2c085673135f35db684b49bcb066572456a32c17a5f92fbfb"} Nov 25 12:44:23 crc kubenswrapper[4693]: I1125 12:44:23.060562 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" podStartSLOduration=1.533282286 podStartE2EDuration="2.060540499s" podCreationTimestamp="2025-11-25 12:44:21 +0000 UTC" firstStartedPulling="2025-11-25 12:44:21.961438066 +0000 UTC m=+2181.879523457" lastFinishedPulling="2025-11-25 12:44:22.488696289 +0000 UTC m=+2182.406781670" observedRunningTime="2025-11-25 12:44:23.055613008 +0000 UTC m=+2182.973698399" watchObservedRunningTime="2025-11-25 12:44:23.060540499 +0000 UTC m=+2182.978625880" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.161517 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f"] Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.163578 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.166148 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.166148 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.188221 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f"] Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.299936 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxn25\" (UniqueName: \"kubernetes.io/projected/cb9e0867-f6aa-43ca-b148-a2850b65ab16-kube-api-access-wxn25\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.300281 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb9e0867-f6aa-43ca-b148-a2850b65ab16-secret-volume\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.300425 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb9e0867-f6aa-43ca-b148-a2850b65ab16-config-volume\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.402952 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb9e0867-f6aa-43ca-b148-a2850b65ab16-secret-volume\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.403075 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb9e0867-f6aa-43ca-b148-a2850b65ab16-config-volume\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.403444 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxn25\" (UniqueName: \"kubernetes.io/projected/cb9e0867-f6aa-43ca-b148-a2850b65ab16-kube-api-access-wxn25\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.404564 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb9e0867-f6aa-43ca-b148-a2850b65ab16-config-volume\") pod 
\"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.417975 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb9e0867-f6aa-43ca-b148-a2850b65ab16-secret-volume\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.422979 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxn25\" (UniqueName: \"kubernetes.io/projected/cb9e0867-f6aa-43ca-b148-a2850b65ab16-kube-api-access-wxn25\") pod \"collect-profiles-29401245-nsf4f\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.494211 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:00 crc kubenswrapper[4693]: I1125 12:45:00.963844 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f"] Nov 25 12:45:01 crc kubenswrapper[4693]: I1125 12:45:01.375903 4693 generic.go:334] "Generic (PLEG): container finished" podID="cb9e0867-f6aa-43ca-b148-a2850b65ab16" containerID="6613c4a5f4af603943c67a036b74f91f493345743ab9fbef9a13e2acb5a0ad0c" exitCode=0 Nov 25 12:45:01 crc kubenswrapper[4693]: I1125 12:45:01.375970 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" event={"ID":"cb9e0867-f6aa-43ca-b148-a2850b65ab16","Type":"ContainerDied","Data":"6613c4a5f4af603943c67a036b74f91f493345743ab9fbef9a13e2acb5a0ad0c"} Nov 25 12:45:01 crc kubenswrapper[4693]: I1125 12:45:01.376263 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" event={"ID":"cb9e0867-f6aa-43ca-b148-a2850b65ab16","Type":"ContainerStarted","Data":"c6620eb02c7802159f78327cab72109548c506f7f2de67b782a9ffe274f019b3"} Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.756018 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.851478 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxn25\" (UniqueName: \"kubernetes.io/projected/cb9e0867-f6aa-43ca-b148-a2850b65ab16-kube-api-access-wxn25\") pod \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.851537 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb9e0867-f6aa-43ca-b148-a2850b65ab16-secret-volume\") pod \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.851698 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb9e0867-f6aa-43ca-b148-a2850b65ab16-config-volume\") pod \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\" (UID: \"cb9e0867-f6aa-43ca-b148-a2850b65ab16\") " Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.852237 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb9e0867-f6aa-43ca-b148-a2850b65ab16-config-volume" (OuterVolumeSpecName: "config-volume") pod "cb9e0867-f6aa-43ca-b148-a2850b65ab16" (UID: "cb9e0867-f6aa-43ca-b148-a2850b65ab16"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.857852 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9e0867-f6aa-43ca-b148-a2850b65ab16-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cb9e0867-f6aa-43ca-b148-a2850b65ab16" (UID: "cb9e0867-f6aa-43ca-b148-a2850b65ab16"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.858218 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9e0867-f6aa-43ca-b148-a2850b65ab16-kube-api-access-wxn25" (OuterVolumeSpecName: "kube-api-access-wxn25") pod "cb9e0867-f6aa-43ca-b148-a2850b65ab16" (UID: "cb9e0867-f6aa-43ca-b148-a2850b65ab16"). InnerVolumeSpecName "kube-api-access-wxn25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.954362 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb9e0867-f6aa-43ca-b148-a2850b65ab16-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.954419 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxn25\" (UniqueName: \"kubernetes.io/projected/cb9e0867-f6aa-43ca-b148-a2850b65ab16-kube-api-access-wxn25\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:02 crc kubenswrapper[4693]: I1125 12:45:02.954453 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb9e0867-f6aa-43ca-b148-a2850b65ab16-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:03 crc kubenswrapper[4693]: I1125 12:45:03.393994 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" event={"ID":"cb9e0867-f6aa-43ca-b148-a2850b65ab16","Type":"ContainerDied","Data":"c6620eb02c7802159f78327cab72109548c506f7f2de67b782a9ffe274f019b3"} Nov 25 12:45:03 crc kubenswrapper[4693]: I1125 12:45:03.394044 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6620eb02c7802159f78327cab72109548c506f7f2de67b782a9ffe274f019b3" Nov 25 12:45:03 crc kubenswrapper[4693]: I1125 12:45:03.394044 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f" Nov 25 12:45:03 crc kubenswrapper[4693]: I1125 12:45:03.830596 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg"] Nov 25 12:45:03 crc kubenswrapper[4693]: I1125 12:45:03.839197 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401200-p77sg"] Nov 25 12:45:04 crc kubenswrapper[4693]: I1125 12:45:04.843675 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="438f718d-ef67-42c1-b624-7d69c5e6b13f" path="/var/lib/kubelet/pods/438f718d-ef67-42c1-b624-7d69c5e6b13f/volumes" Nov 25 12:45:05 crc kubenswrapper[4693]: I1125 12:45:05.113811 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:45:05 crc kubenswrapper[4693]: I1125 12:45:05.113893 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:45:16 crc kubenswrapper[4693]: I1125 12:45:16.870330 4693 generic.go:334] "Generic (PLEG): container finished" podID="3e1334ad-6a95-4d72-95a2-5dfa8d78e530" containerID="73334df6cbea10f2c085673135f35db684b49bcb066572456a32c17a5f92fbfb" exitCode=0 Nov 25 12:45:16 crc kubenswrapper[4693]: I1125 12:45:16.870406 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" 
event={"ID":"3e1334ad-6a95-4d72-95a2-5dfa8d78e530","Type":"ContainerDied","Data":"73334df6cbea10f2c085673135f35db684b49bcb066572456a32c17a5f92fbfb"} Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.359287 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.497491 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-inventory\") pod \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.497565 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-metadata-combined-ca-bundle\") pod \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.497661 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-ssh-key\") pod \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.497705 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-ovn-metadata-agent-neutron-config-0\") pod \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.497793 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpgnw\" (UniqueName: \"kubernetes.io/projected/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-kube-api-access-kpgnw\") pod \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.497845 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-nova-metadata-neutron-config-0\") pod \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\" (UID: \"3e1334ad-6a95-4d72-95a2-5dfa8d78e530\") " Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.509797 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3e1334ad-6a95-4d72-95a2-5dfa8d78e530" (UID: "3e1334ad-6a95-4d72-95a2-5dfa8d78e530"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.512329 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-kube-api-access-kpgnw" (OuterVolumeSpecName: "kube-api-access-kpgnw") pod "3e1334ad-6a95-4d72-95a2-5dfa8d78e530" (UID: "3e1334ad-6a95-4d72-95a2-5dfa8d78e530"). InnerVolumeSpecName "kube-api-access-kpgnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.530709 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "3e1334ad-6a95-4d72-95a2-5dfa8d78e530" (UID: "3e1334ad-6a95-4d72-95a2-5dfa8d78e530"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.530895 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e1334ad-6a95-4d72-95a2-5dfa8d78e530" (UID: "3e1334ad-6a95-4d72-95a2-5dfa8d78e530"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.532205 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "3e1334ad-6a95-4d72-95a2-5dfa8d78e530" (UID: "3e1334ad-6a95-4d72-95a2-5dfa8d78e530"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.532729 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-inventory" (OuterVolumeSpecName: "inventory") pod "3e1334ad-6a95-4d72-95a2-5dfa8d78e530" (UID: "3e1334ad-6a95-4d72-95a2-5dfa8d78e530"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.600466 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.600520 4693 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.600542 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.600562 4693 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.600584 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpgnw\" (UniqueName: \"kubernetes.io/projected/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-kube-api-access-kpgnw\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.600603 4693 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3e1334ad-6a95-4d72-95a2-5dfa8d78e530-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.891468 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" event={"ID":"3e1334ad-6a95-4d72-95a2-5dfa8d78e530","Type":"ContainerDied","Data":"00c1edaace01e41771f36dfaf615f787d074f39cdf2497dfc164c9cb62a35bfa"} Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.891511 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c1edaace01e41771f36dfaf615f787d074f39cdf2497dfc164c9cb62a35bfa" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.891554 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.981992 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62"] Nov 25 12:45:18 crc kubenswrapper[4693]: E1125 12:45:18.982383 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9e0867-f6aa-43ca-b148-a2850b65ab16" containerName="collect-profiles" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.982398 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9e0867-f6aa-43ca-b148-a2850b65ab16" containerName="collect-profiles" Nov 25 12:45:18 crc kubenswrapper[4693]: E1125 12:45:18.982428 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e1334ad-6a95-4d72-95a2-5dfa8d78e530" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.982435 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1334ad-6a95-4d72-95a2-5dfa8d78e530" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.982596 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9e0867-f6aa-43ca-b148-a2850b65ab16" containerName="collect-profiles" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.982610 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1334ad-6a95-4d72-95a2-5dfa8d78e530" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.983219 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.985692 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.985866 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.986051 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.986302 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:45:18 crc kubenswrapper[4693]: I1125 12:45:18.988719 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.004413 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62"] Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.109681 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.109760 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.109797 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.110920 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.111226 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcmn7\" (UniqueName: \"kubernetes.io/projected/df6120d2-3571-4059-8fb1-d40741960cff-kube-api-access-vcmn7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.212972 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcmn7\" (UniqueName: \"kubernetes.io/projected/df6120d2-3571-4059-8fb1-d40741960cff-kube-api-access-vcmn7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.213381 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.213415 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.213459 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.213486 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.218787 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.218906 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.219609 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.220700 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.232356 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcmn7\" (UniqueName: \"kubernetes.io/projected/df6120d2-3571-4059-8fb1-d40741960cff-kube-api-access-vcmn7\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-njx62\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.310972 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.880558 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62"] Nov 25 12:45:19 crc kubenswrapper[4693]: I1125 12:45:19.902678 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" event={"ID":"df6120d2-3571-4059-8fb1-d40741960cff","Type":"ContainerStarted","Data":"45cdd042b755411bb6148ea1e4855ec28affd187d0c7e14fc9674c480c9dc469"} Nov 25 12:45:20 crc kubenswrapper[4693]: I1125 12:45:20.914780 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" event={"ID":"df6120d2-3571-4059-8fb1-d40741960cff","Type":"ContainerStarted","Data":"78d43bcae8978e6422148b74879c20d0d530e4de4d57ac54b18f74168c001a71"} Nov 25 12:45:20 crc kubenswrapper[4693]: I1125 12:45:20.933721 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" podStartSLOduration=2.3119313200000002 podStartE2EDuration="2.933701437s" podCreationTimestamp="2025-11-25 12:45:18 +0000 UTC" firstStartedPulling="2025-11-25 12:45:19.891776943 +0000 UTC m=+2239.809862324" lastFinishedPulling="2025-11-25 12:45:20.51354706 +0000 UTC m=+2240.431632441" observedRunningTime="2025-11-25 12:45:20.931475247 +0000 UTC m=+2240.849560628" watchObservedRunningTime="2025-11-25 12:45:20.933701437 +0000 UTC m=+2240.851786818" Nov 25 12:45:28 crc kubenswrapper[4693]: I1125 12:45:28.480618 4693 scope.go:117] "RemoveContainer" containerID="fc09e76ed5002e18747cc585fb18fd30f052755db3f531c3ab13577b124cbc10" Nov 25 12:45:35 crc kubenswrapper[4693]: I1125 12:45:35.114195 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:45:35 crc kubenswrapper[4693]: I1125 12:45:35.114877 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.114209 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.114782 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.114828 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.115563 4693 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.115631 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" gracePeriod=600 Nov 25 12:46:05 crc kubenswrapper[4693]: E1125 12:46:05.254832 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.345600 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" exitCode=0 Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.345671 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0"} Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.345737 4693 scope.go:117] "RemoveContainer" containerID="be2aba682d9474189f085318cf98009d0155e53628b3d276f5e0ba4c49edb9d9" Nov 25 12:46:05 crc kubenswrapper[4693]: I1125 12:46:05.346451 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:46:05 crc kubenswrapper[4693]: E1125 12:46:05.346860 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:46:17 crc kubenswrapper[4693]: I1125 12:46:17.813741 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:46:17 crc kubenswrapper[4693]: E1125 12:46:17.814866 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:46:30 crc kubenswrapper[4693]: I1125 12:46:30.820526 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 
12:46:30 crc kubenswrapper[4693]: E1125 12:46:30.821406 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:46:42 crc kubenswrapper[4693]: I1125 12:46:42.813320 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:46:42 crc kubenswrapper[4693]: E1125 12:46:42.814073 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:46:45 crc kubenswrapper[4693]: I1125 12:46:45.956906 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vbps2"] Nov 25 12:46:45 crc kubenswrapper[4693]: I1125 12:46:45.963808 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:45 crc kubenswrapper[4693]: I1125 12:46:45.969218 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbps2"] Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.036003 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqb97\" (UniqueName: \"kubernetes.io/projected/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-kube-api-access-dqb97\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.036146 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-catalog-content\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.036179 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-utilities\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.138415 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-catalog-content\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.138482 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-utilities\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.138585 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqb97\" (UniqueName: \"kubernetes.io/projected/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-kube-api-access-dqb97\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.138887 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-catalog-content\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.138898 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-utilities\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.161302 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqb97\" (UniqueName: \"kubernetes.io/projected/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-kube-api-access-dqb97\") pod \"community-operators-vbps2\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.311066 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:46 crc kubenswrapper[4693]: I1125 12:46:46.858875 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbps2"] Nov 25 12:46:47 crc kubenswrapper[4693]: I1125 12:46:47.790302 4693 generic.go:334] "Generic (PLEG): container finished" podID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerID="b243845fde4ff3b2b1a0595a7d1070738aaf806878bd8d17f6792bb493a79697" exitCode=0 Nov 25 12:46:47 crc kubenswrapper[4693]: I1125 12:46:47.790341 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbps2" event={"ID":"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206","Type":"ContainerDied","Data":"b243845fde4ff3b2b1a0595a7d1070738aaf806878bd8d17f6792bb493a79697"} Nov 25 12:46:47 crc kubenswrapper[4693]: I1125 12:46:47.790613 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbps2" event={"ID":"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206","Type":"ContainerStarted","Data":"3a7c344fd48143496d239cf4356384dee5dfaae8283e52e63c6dfce0f1910065"} Nov 25 12:46:47 crc kubenswrapper[4693]: I1125 12:46:47.794149 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 12:46:48 crc kubenswrapper[4693]: I1125 12:46:48.803773 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbps2" event={"ID":"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206","Type":"ContainerStarted","Data":"b338f92a34abfa46d9e79f1953a66d0cf098f8b9b303b34ba2ddf132f00be2e0"} Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.738626 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c984f"] Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.740902 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.765300 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c984f"] Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.809927 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-utilities\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.810354 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-catalog-content\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.810429 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4d8l\" (UniqueName: \"kubernetes.io/projected/b2f9fc20-0403-485f-9bee-48b7826f8110-kube-api-access-p4d8l\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.817397 4693 generic.go:334] "Generic (PLEG): container finished" podID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerID="b338f92a34abfa46d9e79f1953a66d0cf098f8b9b303b34ba2ddf132f00be2e0" exitCode=0 Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.817440 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbps2" event={"ID":"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206","Type":"ContainerDied","Data":"b338f92a34abfa46d9e79f1953a66d0cf098f8b9b303b34ba2ddf132f00be2e0"} Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.913966 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-utilities\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.913999 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-utilities\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.914723 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-catalog-content\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.915098 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4d8l\" (UniqueName: \"kubernetes.io/projected/b2f9fc20-0403-485f-9bee-48b7826f8110-kube-api-access-p4d8l\") pod \"certified-operators-c984f\" (UID: 
\"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.915176 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-catalog-content\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:49 crc kubenswrapper[4693]: I1125 12:46:49.938305 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4d8l\" (UniqueName: \"kubernetes.io/projected/b2f9fc20-0403-485f-9bee-48b7826f8110-kube-api-access-p4d8l\") pod \"certified-operators-c984f\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:50 crc kubenswrapper[4693]: I1125 12:46:50.088645 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:46:50 crc kubenswrapper[4693]: I1125 12:46:50.596706 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c984f"] Nov 25 12:46:50 crc kubenswrapper[4693]: I1125 12:46:50.837172 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbps2" event={"ID":"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206","Type":"ContainerStarted","Data":"8098060e4028a02b576c8f22b2929f8931610b1e1c694cc5ea03966d37b128a7"} Nov 25 12:46:50 crc kubenswrapper[4693]: I1125 12:46:50.840211 4693 generic.go:334] "Generic (PLEG): container finished" podID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerID="3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6" exitCode=0 Nov 25 12:46:50 crc kubenswrapper[4693]: I1125 12:46:50.840269 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c984f" event={"ID":"b2f9fc20-0403-485f-9bee-48b7826f8110","Type":"ContainerDied","Data":"3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6"} Nov 25 12:46:50 crc kubenswrapper[4693]: I1125 12:46:50.840306 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c984f" event={"ID":"b2f9fc20-0403-485f-9bee-48b7826f8110","Type":"ContainerStarted","Data":"fa65d6a1d667271f777a125c56f178ab50915bd19a0860cf11d4322e3167ac3c"} Nov 25 12:46:50 crc kubenswrapper[4693]: I1125 12:46:50.921680 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbps2" podStartSLOduration=3.350594193 podStartE2EDuration="5.921655623s" podCreationTimestamp="2025-11-25 12:46:45 +0000 UTC" firstStartedPulling="2025-11-25 12:46:47.793814516 +0000 UTC m=+2327.711899897" lastFinishedPulling="2025-11-25 12:46:50.364875946 +0000 UTC m=+2330.282961327" observedRunningTime="2025-11-25 12:46:50.878392612 +0000 UTC m=+2330.796478003" watchObservedRunningTime="2025-11-25 12:46:50.921655623 +0000 UTC m=+2330.839741014" Nov 25 12:46:51 crc kubenswrapper[4693]: I1125 12:46:51.858847 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c984f" event={"ID":"b2f9fc20-0403-485f-9bee-48b7826f8110","Type":"ContainerStarted","Data":"63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0"} Nov 25 12:46:52 crc kubenswrapper[4693]: I1125 12:46:52.868066 4693 generic.go:334] "Generic (PLEG): container 
finished" podID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerID="63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0" exitCode=0 Nov 25 12:46:52 crc kubenswrapper[4693]: I1125 12:46:52.868160 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c984f" event={"ID":"b2f9fc20-0403-485f-9bee-48b7826f8110","Type":"ContainerDied","Data":"63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0"} Nov 25 12:46:53 crc kubenswrapper[4693]: I1125 12:46:53.882234 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c984f" event={"ID":"b2f9fc20-0403-485f-9bee-48b7826f8110","Type":"ContainerStarted","Data":"3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f"} Nov 25 12:46:53 crc kubenswrapper[4693]: I1125 12:46:53.910345 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c984f" podStartSLOduration=2.510224181 podStartE2EDuration="4.910323302s" podCreationTimestamp="2025-11-25 12:46:49 +0000 UTC" firstStartedPulling="2025-11-25 12:46:50.84212684 +0000 UTC m=+2330.760212221" lastFinishedPulling="2025-11-25 12:46:53.242225961 +0000 UTC m=+2333.160311342" observedRunningTime="2025-11-25 12:46:53.903300242 +0000 UTC m=+2333.821385623" watchObservedRunningTime="2025-11-25 12:46:53.910323302 +0000 UTC m=+2333.828408693" Nov 25 12:46:56 crc kubenswrapper[4693]: I1125 12:46:56.311447 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:56 crc kubenswrapper[4693]: I1125 12:46:56.312732 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:56 crc kubenswrapper[4693]: I1125 12:46:56.380660 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:56 crc kubenswrapper[4693]: I1125 12:46:56.983107 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:46:57 crc kubenswrapper[4693]: I1125 12:46:57.813978 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:46:57 crc kubenswrapper[4693]: E1125 12:46:57.814264 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:46:58 crc kubenswrapper[4693]: I1125 12:46:58.124722 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbps2"] Nov 25 12:46:58 crc kubenswrapper[4693]: I1125 12:46:58.935176 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbps2" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="registry-server" containerID="cri-o://8098060e4028a02b576c8f22b2929f8931610b1e1c694cc5ea03966d37b128a7" gracePeriod=2 Nov 25 12:46:59 crc kubenswrapper[4693]: I1125 12:46:59.953223 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerID="8098060e4028a02b576c8f22b2929f8931610b1e1c694cc5ea03966d37b128a7" exitCode=0 Nov 25 12:46:59 crc kubenswrapper[4693]: I1125 12:46:59.953269 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbps2" event={"ID":"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206","Type":"ContainerDied","Data":"8098060e4028a02b576c8f22b2929f8931610b1e1c694cc5ea03966d37b128a7"} Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.089695 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.090028 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.139071 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.165121 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.314243 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqb97\" (UniqueName: \"kubernetes.io/projected/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-kube-api-access-dqb97\") pod \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.314295 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-catalog-content\") pod \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.314426 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-utilities\") pod \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\" (UID: \"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206\") " Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.315322 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-utilities" (OuterVolumeSpecName: "utilities") pod "ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" (UID: "ce3c9e8c-5c75-42ae-9cd7-839ca40ed206"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.325613 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-kube-api-access-dqb97" (OuterVolumeSpecName: "kube-api-access-dqb97") pod "ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" (UID: "ce3c9e8c-5c75-42ae-9cd7-839ca40ed206"). InnerVolumeSpecName "kube-api-access-dqb97". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.379330 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" (UID: "ce3c9e8c-5c75-42ae-9cd7-839ca40ed206"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.416574 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqb97\" (UniqueName: \"kubernetes.io/projected/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-kube-api-access-dqb97\") on node \"crc\" DevicePath \"\"" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.416618 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.416635 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.965152 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbps2" Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.965298 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbps2" event={"ID":"ce3c9e8c-5c75-42ae-9cd7-839ca40ed206","Type":"ContainerDied","Data":"3a7c344fd48143496d239cf4356384dee5dfaae8283e52e63c6dfce0f1910065"} Nov 25 12:47:00 crc kubenswrapper[4693]: I1125 12:47:00.965445 4693 scope.go:117] "RemoveContainer" containerID="8098060e4028a02b576c8f22b2929f8931610b1e1c694cc5ea03966d37b128a7" Nov 25 12:47:01 crc kubenswrapper[4693]: I1125 12:47:01.001740 4693 scope.go:117] "RemoveContainer" containerID="b338f92a34abfa46d9e79f1953a66d0cf098f8b9b303b34ba2ddf132f00be2e0" Nov 25 12:47:01 crc kubenswrapper[4693]: I1125 12:47:01.013196 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbps2"] Nov 25 12:47:01 crc kubenswrapper[4693]: I1125 12:47:01.023793 4693 scope.go:117] "RemoveContainer" containerID="b243845fde4ff3b2b1a0595a7d1070738aaf806878bd8d17f6792bb493a79697" Nov 25 12:47:01 crc kubenswrapper[4693]: I1125 12:47:01.028269 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbps2"] Nov 25 12:47:01 crc kubenswrapper[4693]: I1125 12:47:01.046242 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:47:02 crc kubenswrapper[4693]: I1125 12:47:02.831520 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" path="/var/lib/kubelet/pods/ce3c9e8c-5c75-42ae-9cd7-839ca40ed206/volumes" Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.328927 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c984f"] Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.329217 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c984f" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerName="registry-server" containerID="cri-o://3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f" gracePeriod=2 Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.761131 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.888607 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-utilities\") pod \"b2f9fc20-0403-485f-9bee-48b7826f8110\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.888721 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-catalog-content\") pod \"b2f9fc20-0403-485f-9bee-48b7826f8110\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.888888 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4d8l\" (UniqueName: \"kubernetes.io/projected/b2f9fc20-0403-485f-9bee-48b7826f8110-kube-api-access-p4d8l\") pod \"b2f9fc20-0403-485f-9bee-48b7826f8110\" (UID: \"b2f9fc20-0403-485f-9bee-48b7826f8110\") " Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.891807 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-utilities" (OuterVolumeSpecName: "utilities") pod "b2f9fc20-0403-485f-9bee-48b7826f8110" (UID: "b2f9fc20-0403-485f-9bee-48b7826f8110"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.896745 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2f9fc20-0403-485f-9bee-48b7826f8110-kube-api-access-p4d8l" (OuterVolumeSpecName: "kube-api-access-p4d8l") pod "b2f9fc20-0403-485f-9bee-48b7826f8110" (UID: "b2f9fc20-0403-485f-9bee-48b7826f8110"). InnerVolumeSpecName "kube-api-access-p4d8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.949923 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2f9fc20-0403-485f-9bee-48b7826f8110" (UID: "b2f9fc20-0403-485f-9bee-48b7826f8110"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.991121 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4d8l\" (UniqueName: \"kubernetes.io/projected/b2f9fc20-0403-485f-9bee-48b7826f8110-kube-api-access-p4d8l\") on node \"crc\" DevicePath \"\"" Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.991155 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:47:03 crc kubenswrapper[4693]: I1125 12:47:03.991165 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2f9fc20-0403-485f-9bee-48b7826f8110-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.018282 4693 generic.go:334] "Generic (PLEG): container finished" podID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerID="3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f" exitCode=0 Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.018391 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c984f" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.018357 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c984f" event={"ID":"b2f9fc20-0403-485f-9bee-48b7826f8110","Type":"ContainerDied","Data":"3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f"} Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.018465 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c984f" event={"ID":"b2f9fc20-0403-485f-9bee-48b7826f8110","Type":"ContainerDied","Data":"fa65d6a1d667271f777a125c56f178ab50915bd19a0860cf11d4322e3167ac3c"} Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.018491 4693 scope.go:117] "RemoveContainer" containerID="3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.048166 4693 scope.go:117] "RemoveContainer" containerID="63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.056146 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c984f"] Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.064466 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c984f"] Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.069520 4693 scope.go:117] "RemoveContainer" containerID="3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.119726 4693 scope.go:117] "RemoveContainer" containerID="3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f" Nov 25 12:47:04 crc kubenswrapper[4693]: E1125 12:47:04.120144 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f\": container with ID starting with 3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f not found: ID does not exist" containerID="3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.120187 
4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f"} err="failed to get container status \"3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f\": rpc error: code = NotFound desc = could not find container \"3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f\": container with ID starting with 3489a3fee687bba1cdb1a69cc7f35e27632bfa569639d134be02042638da1d2f not found: ID does not exist" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.120213 4693 scope.go:117] "RemoveContainer" containerID="63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0" Nov 25 12:47:04 crc kubenswrapper[4693]: E1125 12:47:04.120714 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0\": container with ID starting with 63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0 not found: ID does not exist" containerID="63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.120760 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0"} err="failed to get container status \"63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0\": rpc error: code = NotFound desc = could not find container \"63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0\": container with ID starting with 63b75c4aa6d78b3ba512365bd90971f2baacb6f80b2223df0c088316b44598d0 not found: ID does not exist" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.120788 4693 scope.go:117] "RemoveContainer" containerID="3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6" Nov 25 12:47:04 crc kubenswrapper[4693]: E1125 12:47:04.121164 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6\": container with ID starting with 3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6 not found: ID does not exist" containerID="3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.121192 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6"} err="failed to get container status \"3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6\": rpc error: code = NotFound desc = could not find container \"3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6\": container with ID starting with 3f9447e1720fad0bb89c04e74acb39ac950dd1e67f7c906b9724bb3399349fa6 not found: ID does not exist" Nov 25 12:47:04 crc kubenswrapper[4693]: I1125 12:47:04.834510 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" path="/var/lib/kubelet/pods/b2f9fc20-0403-485f-9bee-48b7826f8110/volumes" Nov 25 12:47:10 crc kubenswrapper[4693]: E1125 12:47:10.016758 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3c9e8c_5c75_42ae_9cd7_839ca40ed206.slice\": RecentStats: 
unable to find data in memory cache]" Nov 25 12:47:12 crc kubenswrapper[4693]: I1125 12:47:12.813153 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:47:12 crc kubenswrapper[4693]: E1125 12:47:12.813746 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:47:20 crc kubenswrapper[4693]: E1125 12:47:20.277061 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3c9e8c_5c75_42ae_9cd7_839ca40ed206.slice\": RecentStats: unable to find data in memory cache]" Nov 25 12:47:26 crc kubenswrapper[4693]: I1125 12:47:26.813347 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:47:26 crc kubenswrapper[4693]: E1125 12:47:26.814438 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:47:30 crc kubenswrapper[4693]: E1125 12:47:30.516859 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3c9e8c_5c75_42ae_9cd7_839ca40ed206.slice\": RecentStats: unable to find data in memory cache]" Nov 25 12:47:38 crc kubenswrapper[4693]: I1125 12:47:38.813025 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:47:38 crc kubenswrapper[4693]: E1125 12:47:38.814547 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:47:40 crc kubenswrapper[4693]: E1125 12:47:40.817523 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3c9e8c_5c75_42ae_9cd7_839ca40ed206.slice\": RecentStats: unable to find data in memory cache]" Nov 25 12:47:49 crc kubenswrapper[4693]: I1125 12:47:49.813039 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:47:49 crc kubenswrapper[4693]: E1125 12:47:49.814224 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:47:51 crc kubenswrapper[4693]: E1125 12:47:51.167573 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3c9e8c_5c75_42ae_9cd7_839ca40ed206.slice\": RecentStats: unable to find data in memory cache]" Nov 25 12:48:03 crc kubenswrapper[4693]: I1125 12:48:03.813363 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:48:03 crc kubenswrapper[4693]: E1125 12:48:03.814505 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:48:15 crc kubenswrapper[4693]: I1125 12:48:14.813043 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:48:15 crc kubenswrapper[4693]: E1125 12:48:14.813886 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:48:26 crc kubenswrapper[4693]: I1125 12:48:26.812917 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:48:26 crc kubenswrapper[4693]: E1125 12:48:26.813765 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:48:38 crc kubenswrapper[4693]: I1125 12:48:38.812948 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:48:38 crc kubenswrapper[4693]: E1125 12:48:38.813712 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:48:50 crc kubenswrapper[4693]: I1125 12:48:50.831761 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:48:50 crc kubenswrapper[4693]: E1125 12:48:50.833065 4693 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:49:05 crc kubenswrapper[4693]: I1125 12:49:05.813619 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:49:05 crc kubenswrapper[4693]: E1125 12:49:05.814490 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:49:16 crc kubenswrapper[4693]: I1125 12:49:16.812929 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:49:16 crc kubenswrapper[4693]: E1125 12:49:16.813771 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:49:27 crc kubenswrapper[4693]: I1125 12:49:27.813635 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:49:27 crc kubenswrapper[4693]: E1125 12:49:27.814514 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:49:41 crc kubenswrapper[4693]: I1125 12:49:41.812637 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:49:41 crc kubenswrapper[4693]: E1125 12:49:41.813433 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:49:56 crc kubenswrapper[4693]: I1125 12:49:56.813642 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:49:56 crc kubenswrapper[4693]: E1125 12:49:56.814397 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:50:00 crc kubenswrapper[4693]: I1125 12:50:00.142804 4693 generic.go:334] "Generic (PLEG): container finished" podID="df6120d2-3571-4059-8fb1-d40741960cff" containerID="78d43bcae8978e6422148b74879c20d0d530e4de4d57ac54b18f74168c001a71" exitCode=0 Nov 25 12:50:00 crc kubenswrapper[4693]: I1125 12:50:00.142889 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" event={"ID":"df6120d2-3571-4059-8fb1-d40741960cff","Type":"ContainerDied","Data":"78d43bcae8978e6422148b74879c20d0d530e4de4d57ac54b18f74168c001a71"} Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.565981 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.681341 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-secret-0\") pod \"df6120d2-3571-4059-8fb1-d40741960cff\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.681475 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-inventory\") pod \"df6120d2-3571-4059-8fb1-d40741960cff\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.681558 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcmn7\" (UniqueName: \"kubernetes.io/projected/df6120d2-3571-4059-8fb1-d40741960cff-kube-api-access-vcmn7\") pod \"df6120d2-3571-4059-8fb1-d40741960cff\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.681583 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-ssh-key\") pod \"df6120d2-3571-4059-8fb1-d40741960cff\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.681626 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-combined-ca-bundle\") pod \"df6120d2-3571-4059-8fb1-d40741960cff\" (UID: \"df6120d2-3571-4059-8fb1-d40741960cff\") " Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.687854 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "df6120d2-3571-4059-8fb1-d40741960cff" (UID: "df6120d2-3571-4059-8fb1-d40741960cff"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.690124 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6120d2-3571-4059-8fb1-d40741960cff-kube-api-access-vcmn7" (OuterVolumeSpecName: "kube-api-access-vcmn7") pod "df6120d2-3571-4059-8fb1-d40741960cff" (UID: "df6120d2-3571-4059-8fb1-d40741960cff"). InnerVolumeSpecName "kube-api-access-vcmn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.711723 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "df6120d2-3571-4059-8fb1-d40741960cff" (UID: "df6120d2-3571-4059-8fb1-d40741960cff"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.714148 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-inventory" (OuterVolumeSpecName: "inventory") pod "df6120d2-3571-4059-8fb1-d40741960cff" (UID: "df6120d2-3571-4059-8fb1-d40741960cff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.716590 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "df6120d2-3571-4059-8fb1-d40741960cff" (UID: "df6120d2-3571-4059-8fb1-d40741960cff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.785627 4693 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.785941 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-inventory\") on node \"crc\" DevicePath \"\"" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.786031 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcmn7\" (UniqueName: \"kubernetes.io/projected/df6120d2-3571-4059-8fb1-d40741960cff-kube-api-access-vcmn7\") on node \"crc\" DevicePath \"\"" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.786108 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 12:50:01 crc kubenswrapper[4693]: I1125 12:50:01.786185 4693 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6120d2-3571-4059-8fb1-d40741960cff-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.182407 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" event={"ID":"df6120d2-3571-4059-8fb1-d40741960cff","Type":"ContainerDied","Data":"45cdd042b755411bb6148ea1e4855ec28affd187d0c7e14fc9674c480c9dc469"} Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.182667 4693 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45cdd042b755411bb6148ea1e4855ec28affd187d0c7e14fc9674c480c9dc469" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.182627 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-njx62" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270347 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm"] Nov 25 12:50:02 crc kubenswrapper[4693]: E1125 12:50:02.270765 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6120d2-3571-4059-8fb1-d40741960cff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270781 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6120d2-3571-4059-8fb1-d40741960cff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 12:50:02 crc kubenswrapper[4693]: E1125 12:50:02.270792 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerName="extract-content" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270799 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerName="extract-content" Nov 25 12:50:02 crc kubenswrapper[4693]: E1125 12:50:02.270826 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerName="registry-server" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270832 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerName="registry-server" Nov 25 12:50:02 crc kubenswrapper[4693]: E1125 12:50:02.270843 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerName="extract-utilities" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270850 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" containerName="extract-utilities" Nov 25 12:50:02 crc kubenswrapper[4693]: E1125 12:50:02.270865 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="extract-utilities" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270870 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="extract-utilities" Nov 25 12:50:02 crc kubenswrapper[4693]: E1125 12:50:02.270885 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="registry-server" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270890 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="registry-server" Nov 25 12:50:02 crc kubenswrapper[4693]: E1125 12:50:02.270900 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="extract-content" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.270906 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="extract-content" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.271098 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2f9fc20-0403-485f-9bee-48b7826f8110" 
containerName="registry-server" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.271109 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6120d2-3571-4059-8fb1-d40741960cff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.271127 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3c9e8c-5c75-42ae-9cd7-839ca40ed206" containerName="registry-server" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.282776 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm"] Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.282881 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.285346 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.285996 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.286142 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.286316 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.286474 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.286594 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.286691 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.406791 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.406885 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.406912 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv9mf\" (UniqueName: \"kubernetes.io/projected/2a152944-4c08-47ee-bc41-90fa01d90bb1-kube-api-access-hv9mf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.406975 4693 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.407081 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.407118 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.407153 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.407176 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.407208 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.510029 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.510115 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 
12:50:02.510173 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv9mf\" (UniqueName: \"kubernetes.io/projected/2a152944-4c08-47ee-bc41-90fa01d90bb1-kube-api-access-hv9mf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.510229 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.510332 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.510579 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.510611 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.510635 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.511262 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.512186 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.515931 4693 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.516392 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.517007 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.517875 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.518906 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.526077 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.530070 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.531829 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv9mf\" (UniqueName: \"kubernetes.io/projected/2a152944-4c08-47ee-bc41-90fa01d90bb1-kube-api-access-hv9mf\") pod \"nova-edpm-deployment-openstack-edpm-ipam-q4ddm\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:02 crc kubenswrapper[4693]: I1125 12:50:02.634144 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:50:03 crc kubenswrapper[4693]: I1125 12:50:03.191655 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm"] Nov 25 12:50:04 crc kubenswrapper[4693]: I1125 12:50:04.207471 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" event={"ID":"2a152944-4c08-47ee-bc41-90fa01d90bb1","Type":"ContainerStarted","Data":"c1ee06492e77501defb1ef016cde1bdb27874f42512a9907d301b488bf9f8ef6"} Nov 25 12:50:04 crc kubenswrapper[4693]: I1125 12:50:04.208030 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" event={"ID":"2a152944-4c08-47ee-bc41-90fa01d90bb1","Type":"ContainerStarted","Data":"cbca9fbf89a3e515b0932b73cdbcdbc4ec27e4ed1d62fede6332cd957dff5cc1"} Nov 25 12:50:04 crc kubenswrapper[4693]: I1125 12:50:04.226461 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" podStartSLOduration=1.737549029 podStartE2EDuration="2.226442968s" podCreationTimestamp="2025-11-25 12:50:02 +0000 UTC" firstStartedPulling="2025-11-25 12:50:03.198417538 +0000 UTC m=+2523.116502919" lastFinishedPulling="2025-11-25 12:50:03.687311477 +0000 UTC m=+2523.605396858" observedRunningTime="2025-11-25 12:50:04.225468252 +0000 UTC m=+2524.143553633" watchObservedRunningTime="2025-11-25 12:50:04.226442968 +0000 UTC m=+2524.144528349" Nov 25 12:50:09 crc kubenswrapper[4693]: I1125 12:50:09.813641 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:50:09 crc kubenswrapper[4693]: E1125 12:50:09.814767 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:50:24 crc kubenswrapper[4693]: I1125 12:50:24.813356 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:50:24 crc kubenswrapper[4693]: E1125 12:50:24.815316 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:50:38 crc kubenswrapper[4693]: I1125 12:50:38.813300 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:50:38 crc kubenswrapper[4693]: E1125 12:50:38.814659 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" 
podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:50:50 crc kubenswrapper[4693]: I1125 12:50:50.819822 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:50:50 crc kubenswrapper[4693]: E1125 12:50:50.820644 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:51:03 crc kubenswrapper[4693]: I1125 12:51:03.812941 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:51:03 crc kubenswrapper[4693]: E1125 12:51:03.813895 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:51:18 crc kubenswrapper[4693]: I1125 12:51:18.813853 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0" Nov 25 12:51:19 crc kubenswrapper[4693]: I1125 12:51:19.150010 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"a4dd9adf1a9f2c9cf5ccaa1e03409a5005de06e132b88ee5a7daeb3646667496"} Nov 25 12:53:22 crc kubenswrapper[4693]: I1125 12:53:22.400900 4693 generic.go:334] "Generic (PLEG): container finished" podID="2a152944-4c08-47ee-bc41-90fa01d90bb1" containerID="c1ee06492e77501defb1ef016cde1bdb27874f42512a9907d301b488bf9f8ef6" exitCode=0 Nov 25 12:53:22 crc kubenswrapper[4693]: I1125 12:53:22.401047 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" event={"ID":"2a152944-4c08-47ee-bc41-90fa01d90bb1","Type":"ContainerDied","Data":"c1ee06492e77501defb1ef016cde1bdb27874f42512a9907d301b488bf9f8ef6"} Nov 25 12:53:23 crc kubenswrapper[4693]: I1125 12:53:23.932642 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124061 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-1\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124155 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-extra-config-0\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124204 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-1\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124458 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-ssh-key\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124524 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-0\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124569 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-inventory\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124595 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv9mf\" (UniqueName: \"kubernetes.io/projected/2a152944-4c08-47ee-bc41-90fa01d90bb1-kube-api-access-hv9mf\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124704 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-0\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.124804 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-combined-ca-bundle\") pod \"2a152944-4c08-47ee-bc41-90fa01d90bb1\" (UID: \"2a152944-4c08-47ee-bc41-90fa01d90bb1\") " Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.130689 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2a152944-4c08-47ee-bc41-90fa01d90bb1-kube-api-access-hv9mf" (OuterVolumeSpecName: "kube-api-access-hv9mf") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "kube-api-access-hv9mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.138841 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.155533 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.163950 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.169990 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.173927 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.185124 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.185850 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-inventory" (OuterVolumeSpecName: "inventory") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.199649 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "2a152944-4c08-47ee-bc41-90fa01d90bb1" (UID: "2a152944-4c08-47ee-bc41-90fa01d90bb1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228618 4693 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228876 4693 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228912 4693 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228931 4693 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228948 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228963 4693 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228978 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.228990 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv9mf\" (UniqueName: \"kubernetes.io/projected/2a152944-4c08-47ee-bc41-90fa01d90bb1-kube-api-access-hv9mf\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.229003 4693 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2a152944-4c08-47ee-bc41-90fa01d90bb1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.422104 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm" event={"ID":"2a152944-4c08-47ee-bc41-90fa01d90bb1","Type":"ContainerDied","Data":"cbca9fbf89a3e515b0932b73cdbcdbc4ec27e4ed1d62fede6332cd957dff5cc1"}
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.422167 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbca9fbf89a3e515b0932b73cdbcdbc4ec27e4ed1d62fede6332cd957dff5cc1"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.422276 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-q4ddm"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.535242 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"]
Nov 25 12:53:24 crc kubenswrapper[4693]: E1125 12:53:24.536095 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a152944-4c08-47ee-bc41-90fa01d90bb1" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.536123 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a152944-4c08-47ee-bc41-90fa01d90bb1" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.536419 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a152944-4c08-47ee-bc41-90fa01d90bb1" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.537404 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.540333 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.540456 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lw9vv"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.541233 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.542690 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.549225 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.560810 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"]
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.638460 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.638524 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.638561 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.638956 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.639114 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.639265 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.639498 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p65m\" (UniqueName: \"kubernetes.io/projected/ebbe9089-3f4f-46c6-a5ea-ff523e970069-kube-api-access-4p65m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.740927 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.741029 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.741097 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.741186 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p65m\" (UniqueName: \"kubernetes.io/projected/ebbe9089-3f4f-46c6-a5ea-ff523e970069-kube-api-access-4p65m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.741304 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.741336 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.741368 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.746421 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.747058 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.747284 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.747560 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.747794 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.748209 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.760338 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p65m\" (UniqueName: \"kubernetes.io/projected/ebbe9089-3f4f-46c6-a5ea-ff523e970069-kube-api-access-4p65m\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h89kr\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:24 crc kubenswrapper[4693]: I1125 12:53:24.864068 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:53:25 crc kubenswrapper[4693]: I1125 12:53:25.427843 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"]
Nov 25 12:53:25 crc kubenswrapper[4693]: I1125 12:53:25.434837 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 25 12:53:26 crc kubenswrapper[4693]: I1125 12:53:26.441739 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr" event={"ID":"ebbe9089-3f4f-46c6-a5ea-ff523e970069","Type":"ContainerStarted","Data":"2536bcccaba71b2621e9f2255b2a345368bd89b549c2488c1961a52f2087a92b"}
Nov 25 12:53:26 crc kubenswrapper[4693]: I1125 12:53:26.442547 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr" event={"ID":"ebbe9089-3f4f-46c6-a5ea-ff523e970069","Type":"ContainerStarted","Data":"696bc352d32b32348d1f7a59b73c121456a05f90735cdef380f8dc8369488c27"}
Nov 25 12:53:26 crc kubenswrapper[4693]: I1125 12:53:26.461686 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr" podStartSLOduration=1.7896518179999998 podStartE2EDuration="2.461669311s" podCreationTimestamp="2025-11-25 12:53:24 +0000 UTC" firstStartedPulling="2025-11-25 12:53:25.434583751 +0000 UTC m=+2725.352669122" lastFinishedPulling="2025-11-25 12:53:26.106601234 +0000 UTC m=+2726.024686615" observedRunningTime="2025-11-25 12:53:26.459181333 +0000 UTC m=+2726.377266714" watchObservedRunningTime="2025-11-25 12:53:26.461669311 +0000 UTC m=+2726.379754692"
Nov 25 12:53:35 crc kubenswrapper[4693]: I1125 12:53:35.113390 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:53:35 crc kubenswrapper[4693]: I1125 12:53:35.113925 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:54:05 crc kubenswrapper[4693]: I1125 12:54:05.113653 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:54:05 crc kubenswrapper[4693]: I1125 12:54:05.115584 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.636013 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ksm5j"]
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.638478 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.657138 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksm5j"]
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.742271 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-catalog-content\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.742319 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2kjj\" (UniqueName: \"kubernetes.io/projected/c643e25a-283e-4f64-ba1f-a27dd94edfab-kube-api-access-p2kjj\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.742491 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-utilities\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.844736 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-catalog-content\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.844796 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2kjj\" (UniqueName: \"kubernetes.io/projected/c643e25a-283e-4f64-ba1f-a27dd94edfab-kube-api-access-p2kjj\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.844903 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-utilities\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.845833 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-catalog-content\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.846198 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-utilities\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.871289 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2kjj\" (UniqueName: \"kubernetes.io/projected/c643e25a-283e-4f64-ba1f-a27dd94edfab-kube-api-access-p2kjj\") pod \"redhat-marketplace-ksm5j\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") " pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:11 crc kubenswrapper[4693]: I1125 12:54:11.957333 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:12 crc kubenswrapper[4693]: I1125 12:54:12.444437 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksm5j"]
Nov 25 12:54:12 crc kubenswrapper[4693]: I1125 12:54:12.937913 4693 generic.go:334] "Generic (PLEG): container finished" podID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerID="9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d" exitCode=0
Nov 25 12:54:12 crc kubenswrapper[4693]: I1125 12:54:12.938011 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksm5j" event={"ID":"c643e25a-283e-4f64-ba1f-a27dd94edfab","Type":"ContainerDied","Data":"9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d"}
Nov 25 12:54:12 crc kubenswrapper[4693]: I1125 12:54:12.938503 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksm5j" event={"ID":"c643e25a-283e-4f64-ba1f-a27dd94edfab","Type":"ContainerStarted","Data":"f7d0bbb0aa0479727cfd96affd8c4e95b29787d63cb200996c5b212ef27174f1"}
Nov 25 12:54:14 crc kubenswrapper[4693]: I1125 12:54:14.962566 4693 generic.go:334] "Generic (PLEG): container finished" podID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerID="dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4" exitCode=0
Nov 25 12:54:14 crc kubenswrapper[4693]: I1125 12:54:14.962615 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksm5j" event={"ID":"c643e25a-283e-4f64-ba1f-a27dd94edfab","Type":"ContainerDied","Data":"dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4"}
Nov 25 12:54:16 crc kubenswrapper[4693]: I1125 12:54:16.998556 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksm5j" event={"ID":"c643e25a-283e-4f64-ba1f-a27dd94edfab","Type":"ContainerStarted","Data":"972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346"}
Nov 25 12:54:17 crc kubenswrapper[4693]: I1125 12:54:17.027224 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ksm5j" podStartSLOduration=2.596458801 podStartE2EDuration="6.027203251s" podCreationTimestamp="2025-11-25 12:54:11 +0000 UTC" firstStartedPulling="2025-11-25 12:54:12.94261528 +0000 UTC m=+2772.860700671" lastFinishedPulling="2025-11-25 12:54:16.37335974 +0000 UTC m=+2776.291445121" observedRunningTime="2025-11-25 12:54:17.019745608 +0000 UTC m=+2776.937831009" watchObservedRunningTime="2025-11-25 12:54:17.027203251 +0000 UTC m=+2776.945288632"
Nov 25 12:54:21 crc kubenswrapper[4693]: I1125 12:54:21.957625 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:21 crc kubenswrapper[4693]: I1125 12:54:21.958177 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:22 crc kubenswrapper[4693]: I1125 12:54:22.001518 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:22 crc kubenswrapper[4693]: I1125 12:54:22.103218 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:22 crc kubenswrapper[4693]: I1125 12:54:22.243245 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksm5j"]
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.074349 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ksm5j" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="registry-server" containerID="cri-o://972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346" gracePeriod=2
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.581904 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.695820 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-utilities\") pod \"c643e25a-283e-4f64-ba1f-a27dd94edfab\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") "
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.695935 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-catalog-content\") pod \"c643e25a-283e-4f64-ba1f-a27dd94edfab\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") "
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.696298 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2kjj\" (UniqueName: \"kubernetes.io/projected/c643e25a-283e-4f64-ba1f-a27dd94edfab-kube-api-access-p2kjj\") pod \"c643e25a-283e-4f64-ba1f-a27dd94edfab\" (UID: \"c643e25a-283e-4f64-ba1f-a27dd94edfab\") "
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.697005 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-utilities" (OuterVolumeSpecName: "utilities") pod "c643e25a-283e-4f64-ba1f-a27dd94edfab" (UID: "c643e25a-283e-4f64-ba1f-a27dd94edfab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.704514 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c643e25a-283e-4f64-ba1f-a27dd94edfab-kube-api-access-p2kjj" (OuterVolumeSpecName: "kube-api-access-p2kjj") pod "c643e25a-283e-4f64-ba1f-a27dd94edfab" (UID: "c643e25a-283e-4f64-ba1f-a27dd94edfab"). InnerVolumeSpecName "kube-api-access-p2kjj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.719914 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c643e25a-283e-4f64-ba1f-a27dd94edfab" (UID: "c643e25a-283e-4f64-ba1f-a27dd94edfab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.798767 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2kjj\" (UniqueName: \"kubernetes.io/projected/c643e25a-283e-4f64-ba1f-a27dd94edfab-kube-api-access-p2kjj\") on node \"crc\" DevicePath \"\""
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.798806 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 12:54:24 crc kubenswrapper[4693]: I1125 12:54:24.798817 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c643e25a-283e-4f64-ba1f-a27dd94edfab-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.085092 4693 generic.go:334] "Generic (PLEG): container finished" podID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerID="972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346" exitCode=0
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.085153 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksm5j" event={"ID":"c643e25a-283e-4f64-ba1f-a27dd94edfab","Type":"ContainerDied","Data":"972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346"}
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.085166 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksm5j"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.085240 4693 scope.go:117] "RemoveContainer" containerID="972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.085227 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksm5j" event={"ID":"c643e25a-283e-4f64-ba1f-a27dd94edfab","Type":"ContainerDied","Data":"f7d0bbb0aa0479727cfd96affd8c4e95b29787d63cb200996c5b212ef27174f1"}
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.111196 4693 scope.go:117] "RemoveContainer" containerID="dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.116292 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksm5j"]
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.128834 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksm5j"]
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.132701 4693 scope.go:117] "RemoveContainer" containerID="9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.174198 4693 scope.go:117] "RemoveContainer" containerID="972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346"
Nov 25 12:54:25 crc kubenswrapper[4693]: E1125 12:54:25.174678 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346\": container with ID starting with 972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346 not found: ID does not exist" containerID="972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.174731 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346"} err="failed to get container status \"972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346\": rpc error: code = NotFound desc = could not find container \"972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346\": container with ID starting with 972f5a2a2f79d6fef629e6fb7ffc49cc247d20ddbfabdcd01c6600091bfe7346 not found: ID does not exist"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.174763 4693 scope.go:117] "RemoveContainer" containerID="dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4"
Nov 25 12:54:25 crc kubenswrapper[4693]: E1125 12:54:25.175213 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4\": container with ID starting with dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4 not found: ID does not exist" containerID="dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.175248 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4"} err="failed to get container status \"dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4\": rpc error: code = NotFound desc = could not find container \"dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4\": container with ID starting with dbb9a8484d370addeb85a40e1b94879ce2b24d16ef47accdb4881993e34d59d4 not found: ID does not exist"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.175270 4693 scope.go:117] "RemoveContainer" containerID="9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d"
Nov 25 12:54:25 crc kubenswrapper[4693]: E1125 12:54:25.175623 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d\": container with ID starting with 9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d not found: ID does not exist" containerID="9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d"
Nov 25 12:54:25 crc kubenswrapper[4693]: I1125 12:54:25.175659 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d"} err="failed to get container status \"9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d\": rpc error: code = NotFound desc = could not find container \"9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d\": container with ID starting with 9b46c014b13123bfed24d80d9b244fd7879a2b6847869c7326eee2fdeecb046d not found: ID does not exist"
Nov 25 12:54:26 crc kubenswrapper[4693]: I1125 12:54:26.823051 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" path="/var/lib/kubelet/pods/c643e25a-283e-4f64-ba1f-a27dd94edfab/volumes"
Nov 25 12:54:35 crc kubenswrapper[4693]: I1125 12:54:35.119796 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:54:35 crc kubenswrapper[4693]: I1125 12:54:35.120239 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:54:35 crc kubenswrapper[4693]: I1125 12:54:35.120281 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d"
Nov 25 12:54:35 crc kubenswrapper[4693]: I1125 12:54:35.121006 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4dd9adf1a9f2c9cf5ccaa1e03409a5005de06e132b88ee5a7daeb3646667496"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 12:54:35 crc kubenswrapper[4693]: I1125 12:54:35.121053 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://a4dd9adf1a9f2c9cf5ccaa1e03409a5005de06e132b88ee5a7daeb3646667496" gracePeriod=600
Nov 25 12:54:36 crc kubenswrapper[4693]: I1125 12:54:36.197209 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="a4dd9adf1a9f2c9cf5ccaa1e03409a5005de06e132b88ee5a7daeb3646667496" exitCode=0
Nov 25 12:54:36 crc kubenswrapper[4693]: I1125 12:54:36.197259 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"a4dd9adf1a9f2c9cf5ccaa1e03409a5005de06e132b88ee5a7daeb3646667496"}
Nov 25 12:54:36 crc kubenswrapper[4693]: I1125 12:54:36.197904 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae"}
Nov 25 12:54:36 crc kubenswrapper[4693]: I1125 12:54:36.197942 4693 scope.go:117] "RemoveContainer" containerID="829115945e83ec92f9a90936db210586607ed55b023b6d16bfd5fd06459c4bb0"
Nov 25 12:56:01 crc kubenswrapper[4693]: I1125 12:56:01.015228 4693 generic.go:334] "Generic (PLEG): container finished" podID="ebbe9089-3f4f-46c6-a5ea-ff523e970069" containerID="2536bcccaba71b2621e9f2255b2a345368bd89b549c2488c1961a52f2087a92b" exitCode=0
Nov 25 12:56:01 crc kubenswrapper[4693]: I1125 12:56:01.015323 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr" event={"ID":"ebbe9089-3f4f-46c6-a5ea-ff523e970069","Type":"ContainerDied","Data":"2536bcccaba71b2621e9f2255b2a345368bd89b549c2488c1961a52f2087a92b"}
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.511837 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.649636 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-inventory\") pod \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") "
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.649761 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ssh-key\") pod \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") "
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.649882 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p65m\" (UniqueName: \"kubernetes.io/projected/ebbe9089-3f4f-46c6-a5ea-ff523e970069-kube-api-access-4p65m\") pod \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") "
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.649923 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-1\") pod \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") "
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.649987 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-telemetry-combined-ca-bundle\") pod \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") "
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.650013 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-0\") pod \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") "
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.650059 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-2\") pod \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\" (UID: \"ebbe9089-3f4f-46c6-a5ea-ff523e970069\") "
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.658212 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ebbe9089-3f4f-46c6-a5ea-ff523e970069" (UID: "ebbe9089-3f4f-46c6-a5ea-ff523e970069"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.659984 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbe9089-3f4f-46c6-a5ea-ff523e970069-kube-api-access-4p65m" (OuterVolumeSpecName: "kube-api-access-4p65m") pod "ebbe9089-3f4f-46c6-a5ea-ff523e970069" (UID: "ebbe9089-3f4f-46c6-a5ea-ff523e970069"). InnerVolumeSpecName "kube-api-access-4p65m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.684954 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ebbe9089-3f4f-46c6-a5ea-ff523e970069" (UID: "ebbe9089-3f4f-46c6-a5ea-ff523e970069"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.686683 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ebbe9089-3f4f-46c6-a5ea-ff523e970069" (UID: "ebbe9089-3f4f-46c6-a5ea-ff523e970069"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.687416 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ebbe9089-3f4f-46c6-a5ea-ff523e970069" (UID: "ebbe9089-3f4f-46c6-a5ea-ff523e970069"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.687946 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-inventory" (OuterVolumeSpecName: "inventory") pod "ebbe9089-3f4f-46c6-a5ea-ff523e970069" (UID: "ebbe9089-3f4f-46c6-a5ea-ff523e970069"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.689704 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ebbe9089-3f4f-46c6-a5ea-ff523e970069" (UID: "ebbe9089-3f4f-46c6-a5ea-ff523e970069"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.753542 4693 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.753588 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.753608 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.753623 4693 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-inventory\") on node \"crc\" DevicePath \"\""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.753637 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.753648 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p65m\" (UniqueName: \"kubernetes.io/projected/ebbe9089-3f4f-46c6-a5ea-ff523e970069-kube-api-access-4p65m\") on node \"crc\" DevicePath \"\""
Nov 25 12:56:02 crc kubenswrapper[4693]: I1125 12:56:02.753660 4693 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebbe9089-3f4f-46c6-a5ea-ff523e970069-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Nov 25 12:56:03 crc kubenswrapper[4693]: I1125 12:56:03.041180 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr" event={"ID":"ebbe9089-3f4f-46c6-a5ea-ff523e970069","Type":"ContainerDied","Data":"696bc352d32b32348d1f7a59b73c121456a05f90735cdef380f8dc8369488c27"}
Nov 25 12:56:03 crc kubenswrapper[4693]: I1125 12:56:03.041587 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="696bc352d32b32348d1f7a59b73c121456a05f90735cdef380f8dc8369488c27"
Nov 25 12:56:03 crc kubenswrapper[4693]: I1125 12:56:03.041301 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h89kr"
Nov 25 12:56:35 crc kubenswrapper[4693]: I1125 12:56:35.114113 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:56:35 crc kubenswrapper[4693]: I1125 12:56:35.115213 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.635511 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Nov 25 12:56:55 crc kubenswrapper[4693]: E1125 12:56:55.636458 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="extract-utilities"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.636475 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="extract-utilities"
Nov 25 12:56:55 crc kubenswrapper[4693]: E1125 12:56:55.636489 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="extract-content"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.636496 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="extract-content"
Nov 25 12:56:55 crc kubenswrapper[4693]: E1125 12:56:55.636514 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbe9089-3f4f-46c6-a5ea-ff523e970069" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.636521 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbe9089-3f4f-46c6-a5ea-ff523e970069" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:56:55 crc kubenswrapper[4693]: E1125 12:56:55.636539 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="registry-server"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.636545 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="registry-server"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.636758 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbe9089-3f4f-46c6-a5ea-ff523e970069" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.636779 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c643e25a-283e-4f64-ba1f-a27dd94edfab" containerName="registry-server"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.637425 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.640637 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.640681 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.640917 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.641512 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qt8mw"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.646318 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741132 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741256 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741338 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741394 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741430 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741460 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-config-data\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741500 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741524 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxcfl\" (UniqueName: \"kubernetes.io/projected/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-kube-api-access-sxcfl\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.741550 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.843585 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.843674 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.843731 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.843771 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-config-data\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.843848 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.843899 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxcfl\" (UniqueName: \"kubernetes.io/projected/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-kube-api-access-sxcfl\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.844307 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.844528 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.844606 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.844795 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.844806 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.845179 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.845250 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.845498 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-config-data\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.851285 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.853167 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.855319 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.864239 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxcfl\" (UniqueName: \"kubernetes.io/projected/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-kube-api-access-sxcfl\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.895340 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " pod="openstack/tempest-tests-tempest"
Nov 25 12:56:55 crc kubenswrapper[4693]: I1125 12:56:55.956952 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Nov 25 12:56:56 crc kubenswrapper[4693]: W1125 12:56:56.419919 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b66ebc6_a0f0_4418_8d28_7364b1f5d177.slice/crio-e4791e8e84ec354735e0ece1b0e43d9cfa2854f79497283aa031b7ca7e35b67f WatchSource:0}: Error finding container e4791e8e84ec354735e0ece1b0e43d9cfa2854f79497283aa031b7ca7e35b67f: Status 404 returned error can't find the container with id e4791e8e84ec354735e0ece1b0e43d9cfa2854f79497283aa031b7ca7e35b67f
Nov 25 12:56:56 crc kubenswrapper[4693]: I1125 12:56:56.421348 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Nov 25 12:56:56 crc kubenswrapper[4693]: I1125 12:56:56.551531 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b66ebc6-a0f0-4418-8d28-7364b1f5d177","Type":"ContainerStarted","Data":"e4791e8e84ec354735e0ece1b0e43d9cfa2854f79497283aa031b7ca7e35b67f"}
Nov 25 12:57:05 crc kubenswrapper[4693]: I1125 12:57:05.113760 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:57:05 crc kubenswrapper[4693]: I1125 12:57:05.114396 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:57:21 crc kubenswrapper[4693]: E1125 12:57:21.949620 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Nov 25 12:57:21 crc kubenswrapper[4693]: E1125 12:57:21.950505 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxcfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(9b66ebc6-a0f0-4418-8d28-7364b1f5d177): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Nov 25 12:57:21 crc kubenswrapper[4693]: E1125 12:57:21.951725 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="9b66ebc6-a0f0-4418-8d28-7364b1f5d177"
Nov 25 12:57:22 crc kubenswrapper[4693]: E1125 12:57:22.805522 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="9b66ebc6-a0f0-4418-8d28-7364b1f5d177"
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.113831 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.114430 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.114494 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d"
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.115225 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.115295 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" gracePeriod=600
Nov 25 12:57:35 crc kubenswrapper[4693]: E1125 12:57:35.244181 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.959242 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" exitCode=0
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.959320 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae"}
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125 12:57:35.959719 4693 scope.go:117] "RemoveContainer" containerID="a4dd9adf1a9f2c9cf5ccaa1e03409a5005de06e132b88ee5a7daeb3646667496"
Nov 25 12:57:35 crc kubenswrapper[4693]: I1125
12:57:35.960320 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:57:35 crc kubenswrapper[4693]: E1125 12:57:35.960679 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:57:36 crc kubenswrapper[4693]: I1125 12:57:36.241714 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 25 12:57:37 crc kubenswrapper[4693]: I1125 12:57:37.984472 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b66ebc6-a0f0-4418-8d28-7364b1f5d177","Type":"ContainerStarted","Data":"42d44e4c360fd562bc80663098d25332a1f176691a4a88b1f0ffd0871fe51fd9"} Nov 25 12:57:38 crc kubenswrapper[4693]: I1125 12:57:38.015507 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.199645297 podStartE2EDuration="44.015488796s" podCreationTimestamp="2025-11-25 12:56:54 +0000 UTC" firstStartedPulling="2025-11-25 12:56:56.42288922 +0000 UTC m=+2936.340974601" lastFinishedPulling="2025-11-25 12:57:36.238732719 +0000 UTC m=+2976.156818100" observedRunningTime="2025-11-25 12:57:38.006537344 +0000 UTC m=+2977.924622745" watchObservedRunningTime="2025-11-25 12:57:38.015488796 +0000 UTC m=+2977.933574177" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.630579 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zgx65"] Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.635874 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.647872 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgx65"] Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.792090 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwslv\" (UniqueName: \"kubernetes.io/projected/5e290613-27fe-41ce-a9f9-125fb9a88d95-kube-api-access-jwslv\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.792530 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-utilities\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.792612 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-catalog-content\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.894608 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-catalog-content\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.894774 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwslv\" (UniqueName: \"kubernetes.io/projected/5e290613-27fe-41ce-a9f9-125fb9a88d95-kube-api-access-jwslv\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.894870 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-utilities\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.895351 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-catalog-content\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.895398 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-utilities\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.918182 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jwslv\" (UniqueName: \"kubernetes.io/projected/5e290613-27fe-41ce-a9f9-125fb9a88d95-kube-api-access-jwslv\") pod \"certified-operators-zgx65\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:43 crc kubenswrapper[4693]: I1125 12:57:43.977032 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:44 crc kubenswrapper[4693]: I1125 12:57:44.521904 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgx65"] Nov 25 12:57:45 crc kubenswrapper[4693]: I1125 12:57:45.065657 4693 generic.go:334] "Generic (PLEG): container finished" podID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerID="d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec" exitCode=0 Nov 25 12:57:45 crc kubenswrapper[4693]: I1125 12:57:45.065798 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgx65" event={"ID":"5e290613-27fe-41ce-a9f9-125fb9a88d95","Type":"ContainerDied","Data":"d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec"} Nov 25 12:57:45 crc kubenswrapper[4693]: I1125 12:57:45.065942 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgx65" event={"ID":"5e290613-27fe-41ce-a9f9-125fb9a88d95","Type":"ContainerStarted","Data":"fce0244fa55b629d48d7164e9881f0cc9d97773563d95782836d79300e56feb1"} Nov 25 12:57:46 crc kubenswrapper[4693]: I1125 12:57:46.077301 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgx65" event={"ID":"5e290613-27fe-41ce-a9f9-125fb9a88d95","Type":"ContainerStarted","Data":"2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f"} Nov 25 12:57:47 crc kubenswrapper[4693]: I1125 12:57:47.092298 4693 generic.go:334] "Generic (PLEG): container finished" podID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerID="2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f" exitCode=0 Nov 25 12:57:47 crc kubenswrapper[4693]: I1125 12:57:47.092595 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgx65" event={"ID":"5e290613-27fe-41ce-a9f9-125fb9a88d95","Type":"ContainerDied","Data":"2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f"} Nov 25 12:57:47 crc kubenswrapper[4693]: I1125 12:57:47.813357 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:57:47 crc kubenswrapper[4693]: E1125 12:57:47.814079 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:57:47 crc kubenswrapper[4693]: I1125 12:57:47.990693 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nljbp"] Nov 25 12:57:47 crc kubenswrapper[4693]: I1125 12:57:47.993096 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.013913 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nljbp"] Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.079330 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-catalog-content\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.079571 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-utilities\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.079769 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7xc2\" (UniqueName: \"kubernetes.io/projected/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-kube-api-access-c7xc2\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.102567 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgx65" event={"ID":"5e290613-27fe-41ce-a9f9-125fb9a88d95","Type":"ContainerStarted","Data":"6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c"} Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.123965 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zgx65" podStartSLOduration=2.690324246 podStartE2EDuration="5.123947096s" podCreationTimestamp="2025-11-25 12:57:43 +0000 UTC" firstStartedPulling="2025-11-25 12:57:45.067615758 +0000 UTC m=+2984.985701129" lastFinishedPulling="2025-11-25 12:57:47.501238588 +0000 UTC m=+2987.419323979" observedRunningTime="2025-11-25 12:57:48.122491327 +0000 UTC m=+2988.040576778" watchObservedRunningTime="2025-11-25 12:57:48.123947096 +0000 UTC m=+2988.042032477" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.182167 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-catalog-content\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.182298 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-utilities\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.182427 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7xc2\" (UniqueName: \"kubernetes.io/projected/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-kube-api-access-c7xc2\") pod \"community-operators-nljbp\" (UID: 
\"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.182779 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-catalog-content\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.183048 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-utilities\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.204768 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7xc2\" (UniqueName: \"kubernetes.io/projected/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-kube-api-access-c7xc2\") pod \"community-operators-nljbp\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.311736 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:48 crc kubenswrapper[4693]: I1125 12:57:48.906993 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nljbp"] Nov 25 12:57:48 crc kubenswrapper[4693]: W1125 12:57:48.921649 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc105f651_b8c6_4f4e_ae56_4a5e2131c85d.slice/crio-592e6975792137678c872fe1419c7af073c4585399f25d25388f13f98bc32618 WatchSource:0}: Error finding container 592e6975792137678c872fe1419c7af073c4585399f25d25388f13f98bc32618: Status 404 returned error can't find the container with id 592e6975792137678c872fe1419c7af073c4585399f25d25388f13f98bc32618 Nov 25 12:57:49 crc kubenswrapper[4693]: I1125 12:57:49.113061 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nljbp" event={"ID":"c105f651-b8c6-4f4e-ae56-4a5e2131c85d","Type":"ContainerStarted","Data":"592e6975792137678c872fe1419c7af073c4585399f25d25388f13f98bc32618"} Nov 25 12:57:50 crc kubenswrapper[4693]: I1125 12:57:50.122772 4693 generic.go:334] "Generic (PLEG): container finished" podID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerID="7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb" exitCode=0 Nov 25 12:57:50 crc kubenswrapper[4693]: I1125 12:57:50.122825 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nljbp" event={"ID":"c105f651-b8c6-4f4e-ae56-4a5e2131c85d","Type":"ContainerDied","Data":"7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb"} Nov 25 12:57:51 crc kubenswrapper[4693]: I1125 12:57:51.142902 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nljbp" event={"ID":"c105f651-b8c6-4f4e-ae56-4a5e2131c85d","Type":"ContainerStarted","Data":"691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b"} Nov 25 12:57:52 crc kubenswrapper[4693]: I1125 12:57:52.154749 4693 generic.go:334] "Generic (PLEG): container finished" 
podID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerID="691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b" exitCode=0 Nov 25 12:57:52 crc kubenswrapper[4693]: I1125 12:57:52.154935 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nljbp" event={"ID":"c105f651-b8c6-4f4e-ae56-4a5e2131c85d","Type":"ContainerDied","Data":"691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b"} Nov 25 12:57:53 crc kubenswrapper[4693]: I1125 12:57:53.172213 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nljbp" event={"ID":"c105f651-b8c6-4f4e-ae56-4a5e2131c85d","Type":"ContainerStarted","Data":"474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62"} Nov 25 12:57:53 crc kubenswrapper[4693]: I1125 12:57:53.194711 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nljbp" podStartSLOduration=3.449785084 podStartE2EDuration="6.194695217s" podCreationTimestamp="2025-11-25 12:57:47 +0000 UTC" firstStartedPulling="2025-11-25 12:57:50.126888489 +0000 UTC m=+2990.044973870" lastFinishedPulling="2025-11-25 12:57:52.871798622 +0000 UTC m=+2992.789884003" observedRunningTime="2025-11-25 12:57:53.190916305 +0000 UTC m=+2993.109001686" watchObservedRunningTime="2025-11-25 12:57:53.194695217 +0000 UTC m=+2993.112780598" Nov 25 12:57:53 crc kubenswrapper[4693]: I1125 12:57:53.977999 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:53 crc kubenswrapper[4693]: I1125 12:57:53.978488 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:54 crc kubenswrapper[4693]: I1125 12:57:54.026918 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:54 crc kubenswrapper[4693]: I1125 12:57:54.230838 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.185287 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgx65"] Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.203898 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zgx65" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="registry-server" containerID="cri-o://6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c" gracePeriod=2 Nov 25 12:57:56 crc kubenswrapper[4693]: E1125 12:57:56.381678 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e290613_27fe_41ce_a9f9_125fb9a88d95.slice/crio-6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e290613_27fe_41ce_a9f9_125fb9a88d95.slice/crio-conmon-6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c.scope\": RecentStats: unable to find data in memory cache]" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.714907 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.879718 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-utilities\") pod \"5e290613-27fe-41ce-a9f9-125fb9a88d95\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.879834 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwslv\" (UniqueName: \"kubernetes.io/projected/5e290613-27fe-41ce-a9f9-125fb9a88d95-kube-api-access-jwslv\") pod \"5e290613-27fe-41ce-a9f9-125fb9a88d95\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.879955 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-catalog-content\") pod \"5e290613-27fe-41ce-a9f9-125fb9a88d95\" (UID: \"5e290613-27fe-41ce-a9f9-125fb9a88d95\") " Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.881680 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-utilities" (OuterVolumeSpecName: "utilities") pod "5e290613-27fe-41ce-a9f9-125fb9a88d95" (UID: "5e290613-27fe-41ce-a9f9-125fb9a88d95"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.887996 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e290613-27fe-41ce-a9f9-125fb9a88d95-kube-api-access-jwslv" (OuterVolumeSpecName: "kube-api-access-jwslv") pod "5e290613-27fe-41ce-a9f9-125fb9a88d95" (UID: "5e290613-27fe-41ce-a9f9-125fb9a88d95"). InnerVolumeSpecName "kube-api-access-jwslv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.940919 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e290613-27fe-41ce-a9f9-125fb9a88d95" (UID: "5e290613-27fe-41ce-a9f9-125fb9a88d95"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.982612 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.982650 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwslv\" (UniqueName: \"kubernetes.io/projected/5e290613-27fe-41ce-a9f9-125fb9a88d95-kube-api-access-jwslv\") on node \"crc\" DevicePath \"\"" Nov 25 12:57:56 crc kubenswrapper[4693]: I1125 12:57:56.982660 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e290613-27fe-41ce-a9f9-125fb9a88d95-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.220334 4693 generic.go:334] "Generic (PLEG): container finished" podID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerID="6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c" exitCode=0 Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.220415 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgx65" event={"ID":"5e290613-27fe-41ce-a9f9-125fb9a88d95","Type":"ContainerDied","Data":"6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c"} Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.220446 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgx65" event={"ID":"5e290613-27fe-41ce-a9f9-125fb9a88d95","Type":"ContainerDied","Data":"fce0244fa55b629d48d7164e9881f0cc9d97773563d95782836d79300e56feb1"} Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.220467 4693 scope.go:117] "RemoveContainer" containerID="6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.220598 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgx65" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.278896 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgx65"] Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.289822 4693 scope.go:117] "RemoveContainer" containerID="2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.290805 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zgx65"] Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.324677 4693 scope.go:117] "RemoveContainer" containerID="d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.382341 4693 scope.go:117] "RemoveContainer" containerID="6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c" Nov 25 12:57:57 crc kubenswrapper[4693]: E1125 12:57:57.382888 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c\": container with ID starting with 6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c not found: ID does not exist" containerID="6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.382947 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c"} err="failed to get container status \"6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c\": rpc error: code = NotFound desc = could not find container \"6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c\": container with ID starting with 6f240768679bcb7700b4edec55e0886e772701c39fd7c6237d15ca90bd66f00c not found: ID does not exist" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.382980 4693 scope.go:117] "RemoveContainer" containerID="2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f" Nov 25 12:57:57 crc kubenswrapper[4693]: E1125 12:57:57.383527 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f\": container with ID starting with 2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f not found: ID does not exist" containerID="2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.383569 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f"} err="failed to get container status \"2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f\": rpc error: code = NotFound desc = could not find container \"2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f\": container with ID starting with 2e3c02c2ca358042cbe3fd9e724333e2f06a5d10f03cbf0f4d826d1c342b861f not found: ID does not exist" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.383600 4693 scope.go:117] "RemoveContainer" containerID="d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec" Nov 25 12:57:57 crc kubenswrapper[4693]: E1125 12:57:57.383938 4693 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec\": container with ID starting with d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec not found: ID does not exist" containerID="d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec" Nov 25 12:57:57 crc kubenswrapper[4693]: I1125 12:57:57.383972 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec"} err="failed to get container status \"d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec\": rpc error: code = NotFound desc = could not find container \"d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec\": container with ID starting with d03620b60649abf0a2cf3ea20803e86f991fd5c3c6a6ceb3efbd75c1466254ec not found: ID does not exist" Nov 25 12:57:58 crc kubenswrapper[4693]: I1125 12:57:58.312481 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:58 crc kubenswrapper[4693]: I1125 12:57:58.313541 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:57:58 crc kubenswrapper[4693]: I1125 12:57:58.825357 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" path="/var/lib/kubelet/pods/5e290613-27fe-41ce-a9f9-125fb9a88d95/volumes" Nov 25 12:57:59 crc kubenswrapper[4693]: I1125 12:57:59.385167 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-nljbp" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="registry-server" probeResult="failure" output=< Nov 25 12:57:59 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 12:57:59 crc kubenswrapper[4693]: > Nov 25 12:58:02 crc kubenswrapper[4693]: I1125 12:58:02.813619 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:58:02 crc kubenswrapper[4693]: E1125 12:58:02.814618 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:58:08 crc kubenswrapper[4693]: I1125 12:58:08.381951 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:58:08 crc kubenswrapper[4693]: I1125 12:58:08.436990 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:58:08 crc kubenswrapper[4693]: I1125 12:58:08.620612 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nljbp"] Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.370913 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nljbp" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="registry-server" 
containerID="cri-o://474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62" gracePeriod=2 Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.889433 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.983580 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-utilities\") pod \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.983766 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7xc2\" (UniqueName: \"kubernetes.io/projected/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-kube-api-access-c7xc2\") pod \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.983981 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-catalog-content\") pod \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\" (UID: \"c105f651-b8c6-4f4e-ae56-4a5e2131c85d\") " Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.984307 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-utilities" (OuterVolumeSpecName: "utilities") pod "c105f651-b8c6-4f4e-ae56-4a5e2131c85d" (UID: "c105f651-b8c6-4f4e-ae56-4a5e2131c85d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.985145 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 12:58:10 crc kubenswrapper[4693]: I1125 12:58:10.996573 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-kube-api-access-c7xc2" (OuterVolumeSpecName: "kube-api-access-c7xc2") pod "c105f651-b8c6-4f4e-ae56-4a5e2131c85d" (UID: "c105f651-b8c6-4f4e-ae56-4a5e2131c85d"). InnerVolumeSpecName "kube-api-access-c7xc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.033994 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c105f651-b8c6-4f4e-ae56-4a5e2131c85d" (UID: "c105f651-b8c6-4f4e-ae56-4a5e2131c85d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.087261 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7xc2\" (UniqueName: \"kubernetes.io/projected/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-kube-api-access-c7xc2\") on node \"crc\" DevicePath \"\"" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.087296 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c105f651-b8c6-4f4e-ae56-4a5e2131c85d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.385256 4693 generic.go:334] "Generic (PLEG): container finished" podID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerID="474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62" exitCode=0 Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.385317 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nljbp" event={"ID":"c105f651-b8c6-4f4e-ae56-4a5e2131c85d","Type":"ContainerDied","Data":"474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62"} Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.385353 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nljbp" event={"ID":"c105f651-b8c6-4f4e-ae56-4a5e2131c85d","Type":"ContainerDied","Data":"592e6975792137678c872fe1419c7af073c4585399f25d25388f13f98bc32618"} Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.385405 4693 scope.go:117] "RemoveContainer" containerID="474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.385627 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nljbp" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.430028 4693 scope.go:117] "RemoveContainer" containerID="691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.449891 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nljbp"] Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.463107 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nljbp"] Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.473644 4693 scope.go:117] "RemoveContainer" containerID="7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.539504 4693 scope.go:117] "RemoveContainer" containerID="474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62" Nov 25 12:58:11 crc kubenswrapper[4693]: E1125 12:58:11.540099 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62\": container with ID starting with 474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62 not found: ID does not exist" containerID="474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.540171 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62"} err="failed to get container status \"474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62\": rpc error: code = NotFound desc = could not find container \"474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62\": container with ID starting with 474e5fd6491a37a97480b287bc80c382cfebfef9ea597f0bb7eb710236eb6d62 not found: ID does not exist" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.540210 4693 scope.go:117] "RemoveContainer" containerID="691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b" Nov 25 12:58:11 crc kubenswrapper[4693]: E1125 12:58:11.540727 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b\": container with ID starting with 691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b not found: ID does not exist" containerID="691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.540824 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b"} err="failed to get container status \"691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b\": rpc error: code = NotFound desc = could not find container \"691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b\": container with ID starting with 691f457aba8d7a1842a35e9a4dbd76b716ce49c16018367306d93caf2f719f6b not found: ID does not exist" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.540899 4693 scope.go:117] "RemoveContainer" containerID="7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb" Nov 25 12:58:11 crc kubenswrapper[4693]: E1125 12:58:11.541258 4693 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb\": container with ID starting with 7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb not found: ID does not exist" containerID="7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb" Nov 25 12:58:11 crc kubenswrapper[4693]: I1125 12:58:11.541306 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb"} err="failed to get container status \"7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb\": rpc error: code = NotFound desc = could not find container \"7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb\": container with ID starting with 7d841010f3ddb11dbfaccea479a4e9d21699d43d263f66c1598a11cb09611bcb not found: ID does not exist" Nov 25 12:58:12 crc kubenswrapper[4693]: I1125 12:58:12.842186 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" path="/var/lib/kubelet/pods/c105f651-b8c6-4f4e-ae56-4a5e2131c85d/volumes" Nov 25 12:58:17 crc kubenswrapper[4693]: I1125 12:58:17.812926 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:58:17 crc kubenswrapper[4693]: E1125 12:58:17.813681 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:58:32 crc kubenswrapper[4693]: I1125 12:58:32.813856 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:58:32 crc kubenswrapper[4693]: E1125 12:58:32.816394 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:58:45 crc kubenswrapper[4693]: I1125 12:58:45.813278 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:58:45 crc kubenswrapper[4693]: E1125 12:58:45.814505 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:58:59 crc kubenswrapper[4693]: I1125 12:58:59.812719 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:58:59 crc kubenswrapper[4693]: E1125 12:58:59.813585 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:59:13 crc kubenswrapper[4693]: I1125 12:59:13.812533 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:59:13 crc kubenswrapper[4693]: E1125 12:59:13.813250 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:59:25 crc kubenswrapper[4693]: I1125 12:59:25.813427 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:59:25 crc kubenswrapper[4693]: E1125 12:59:25.814325 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:59:38 crc kubenswrapper[4693]: I1125 12:59:38.813199 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:59:38 crc kubenswrapper[4693]: E1125 12:59:38.814157 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 12:59:50 crc kubenswrapper[4693]: I1125 12:59:50.825298 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 12:59:50 crc kubenswrapper[4693]: E1125 12:59:50.826331 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.144361 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7"] Nov 25 13:00:00 crc kubenswrapper[4693]: E1125 13:00:00.145167 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="extract-utilities" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145180 4693 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="extract-utilities" Nov 25 13:00:00 crc kubenswrapper[4693]: E1125 13:00:00.145197 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="extract-content" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145205 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="extract-content" Nov 25 13:00:00 crc kubenswrapper[4693]: E1125 13:00:00.145225 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="registry-server" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145231 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="registry-server" Nov 25 13:00:00 crc kubenswrapper[4693]: E1125 13:00:00.145246 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="extract-content" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145252 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="extract-content" Nov 25 13:00:00 crc kubenswrapper[4693]: E1125 13:00:00.145264 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="extract-utilities" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145270 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="extract-utilities" Nov 25 13:00:00 crc kubenswrapper[4693]: E1125 13:00:00.145283 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="registry-server" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145288 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="registry-server" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145494 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c105f651-b8c6-4f4e-ae56-4a5e2131c85d" containerName="registry-server" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.145506 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e290613-27fe-41ce-a9f9-125fb9a88d95" containerName="registry-server" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.146116 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.148132 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.148169 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.158254 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7"] Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.274887 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e05e6b7-0450-458b-9a93-86b062156f37-config-volume\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.274960 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e05e6b7-0450-458b-9a93-86b062156f37-secret-volume\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.275212 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fp7\" (UniqueName: \"kubernetes.io/projected/6e05e6b7-0450-458b-9a93-86b062156f37-kube-api-access-d5fp7\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.377026 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e05e6b7-0450-458b-9a93-86b062156f37-config-volume\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.377119 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e05e6b7-0450-458b-9a93-86b062156f37-secret-volume\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.377224 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fp7\" (UniqueName: \"kubernetes.io/projected/6e05e6b7-0450-458b-9a93-86b062156f37-kube-api-access-d5fp7\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.378168 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e05e6b7-0450-458b-9a93-86b062156f37-config-volume\") pod 
\"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.385247 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e05e6b7-0450-458b-9a93-86b062156f37-secret-volume\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.393723 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fp7\" (UniqueName: \"kubernetes.io/projected/6e05e6b7-0450-458b-9a93-86b062156f37-kube-api-access-d5fp7\") pod \"collect-profiles-29401260-td9h7\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.478390 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:00 crc kubenswrapper[4693]: I1125 13:00:00.902106 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7"] Nov 25 13:00:00 crc kubenswrapper[4693]: W1125 13:00:00.907503 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e05e6b7_0450_458b_9a93_86b062156f37.slice/crio-a3db25356c93923b87cd62c02227f4c39446420acd837f6d0d2a56bd19ee0d27 WatchSource:0}: Error finding container a3db25356c93923b87cd62c02227f4c39446420acd837f6d0d2a56bd19ee0d27: Status 404 returned error can't find the container with id a3db25356c93923b87cd62c02227f4c39446420acd837f6d0d2a56bd19ee0d27 Nov 25 13:00:01 crc kubenswrapper[4693]: I1125 13:00:01.562050 4693 generic.go:334] "Generic (PLEG): container finished" podID="6e05e6b7-0450-458b-9a93-86b062156f37" containerID="096fd666b3967d599fb8999a193369f555ea3897b676549a4a8af8eda8c7fbdd" exitCode=0 Nov 25 13:00:01 crc kubenswrapper[4693]: I1125 13:00:01.562146 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" event={"ID":"6e05e6b7-0450-458b-9a93-86b062156f37","Type":"ContainerDied","Data":"096fd666b3967d599fb8999a193369f555ea3897b676549a4a8af8eda8c7fbdd"} Nov 25 13:00:01 crc kubenswrapper[4693]: I1125 13:00:01.562347 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" event={"ID":"6e05e6b7-0450-458b-9a93-86b062156f37","Type":"ContainerStarted","Data":"a3db25356c93923b87cd62c02227f4c39446420acd837f6d0d2a56bd19ee0d27"} Nov 25 13:00:02 crc kubenswrapper[4693]: I1125 13:00:02.814636 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:00:02 crc kubenswrapper[4693]: E1125 13:00:02.815033 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" 
podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:00:02 crc kubenswrapper[4693]: I1125 13:00:02.913104 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.031502 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e05e6b7-0450-458b-9a93-86b062156f37-config-volume\") pod \"6e05e6b7-0450-458b-9a93-86b062156f37\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.032271 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e05e6b7-0450-458b-9a93-86b062156f37-secret-volume\") pod \"6e05e6b7-0450-458b-9a93-86b062156f37\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.032412 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e05e6b7-0450-458b-9a93-86b062156f37-config-volume" (OuterVolumeSpecName: "config-volume") pod "6e05e6b7-0450-458b-9a93-86b062156f37" (UID: "6e05e6b7-0450-458b-9a93-86b062156f37"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.032548 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5fp7\" (UniqueName: \"kubernetes.io/projected/6e05e6b7-0450-458b-9a93-86b062156f37-kube-api-access-d5fp7\") pod \"6e05e6b7-0450-458b-9a93-86b062156f37\" (UID: \"6e05e6b7-0450-458b-9a93-86b062156f37\") " Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.033102 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e05e6b7-0450-458b-9a93-86b062156f37-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.038349 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e05e6b7-0450-458b-9a93-86b062156f37-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6e05e6b7-0450-458b-9a93-86b062156f37" (UID: "6e05e6b7-0450-458b-9a93-86b062156f37"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.044627 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e05e6b7-0450-458b-9a93-86b062156f37-kube-api-access-d5fp7" (OuterVolumeSpecName: "kube-api-access-d5fp7") pod "6e05e6b7-0450-458b-9a93-86b062156f37" (UID: "6e05e6b7-0450-458b-9a93-86b062156f37"). InnerVolumeSpecName "kube-api-access-d5fp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.134624 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5fp7\" (UniqueName: \"kubernetes.io/projected/6e05e6b7-0450-458b-9a93-86b062156f37-kube-api-access-d5fp7\") on node \"crc\" DevicePath \"\"" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.134661 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e05e6b7-0450-458b-9a93-86b062156f37-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.584638 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" event={"ID":"6e05e6b7-0450-458b-9a93-86b062156f37","Type":"ContainerDied","Data":"a3db25356c93923b87cd62c02227f4c39446420acd837f6d0d2a56bd19ee0d27"} Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.584697 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3db25356c93923b87cd62c02227f4c39446420acd837f6d0d2a56bd19ee0d27" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.584696 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401260-td9h7" Nov 25 13:00:03 crc kubenswrapper[4693]: I1125 13:00:03.990210 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9"] Nov 25 13:00:04 crc kubenswrapper[4693]: I1125 13:00:04.000716 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401215-qv6t9"] Nov 25 13:00:04 crc kubenswrapper[4693]: I1125 13:00:04.827266 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d" path="/var/lib/kubelet/pods/401b1f6c-b0a2-4a70-8f4a-2e1a5537c00d/volumes" Nov 25 13:00:17 crc kubenswrapper[4693]: I1125 13:00:17.812990 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:00:17 crc kubenswrapper[4693]: E1125 13:00:17.813889 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.212667 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5vb8q"] Nov 25 13:00:21 crc kubenswrapper[4693]: E1125 13:00:21.213940 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e05e6b7-0450-458b-9a93-86b062156f37" containerName="collect-profiles" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.213960 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e05e6b7-0450-458b-9a93-86b062156f37" containerName="collect-profiles" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.214218 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e05e6b7-0450-458b-9a93-86b062156f37" containerName="collect-profiles" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.216080 4693 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.225691 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vb8q"] Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.242385 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-utilities\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.242445 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-catalog-content\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.242466 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mwck\" (UniqueName: \"kubernetes.io/projected/e771b3cc-5f13-42d5-979c-c075104676ce-kube-api-access-9mwck\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.345483 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-utilities\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.345570 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-catalog-content\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.345602 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mwck\" (UniqueName: \"kubernetes.io/projected/e771b3cc-5f13-42d5-979c-c075104676ce-kube-api-access-9mwck\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.346085 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-catalog-content\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.346218 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-utilities\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.364312 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-9mwck\" (UniqueName: \"kubernetes.io/projected/e771b3cc-5f13-42d5-979c-c075104676ce-kube-api-access-9mwck\") pod \"redhat-operators-5vb8q\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:21 crc kubenswrapper[4693]: I1125 13:00:21.545984 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:22 crc kubenswrapper[4693]: I1125 13:00:22.029414 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5vb8q"] Nov 25 13:00:22 crc kubenswrapper[4693]: I1125 13:00:22.770272 4693 generic.go:334] "Generic (PLEG): container finished" podID="e771b3cc-5f13-42d5-979c-c075104676ce" containerID="bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4" exitCode=0 Nov 25 13:00:22 crc kubenswrapper[4693]: I1125 13:00:22.770593 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vb8q" event={"ID":"e771b3cc-5f13-42d5-979c-c075104676ce","Type":"ContainerDied","Data":"bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4"} Nov 25 13:00:22 crc kubenswrapper[4693]: I1125 13:00:22.770661 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vb8q" event={"ID":"e771b3cc-5f13-42d5-979c-c075104676ce","Type":"ContainerStarted","Data":"a6684f742fe4bfef184a17882105ac91129656e8bdcc5467c885ddac9e749d2b"} Nov 25 13:00:22 crc kubenswrapper[4693]: I1125 13:00:22.772527 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 13:00:23 crc kubenswrapper[4693]: I1125 13:00:23.783722 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vb8q" event={"ID":"e771b3cc-5f13-42d5-979c-c075104676ce","Type":"ContainerStarted","Data":"a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52"} Nov 25 13:00:25 crc kubenswrapper[4693]: I1125 13:00:25.801996 4693 generic.go:334] "Generic (PLEG): container finished" podID="e771b3cc-5f13-42d5-979c-c075104676ce" containerID="a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52" exitCode=0 Nov 25 13:00:25 crc kubenswrapper[4693]: I1125 13:00:25.802066 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vb8q" event={"ID":"e771b3cc-5f13-42d5-979c-c075104676ce","Type":"ContainerDied","Data":"a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52"} Nov 25 13:00:26 crc kubenswrapper[4693]: I1125 13:00:26.828191 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vb8q" event={"ID":"e771b3cc-5f13-42d5-979c-c075104676ce","Type":"ContainerStarted","Data":"06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519"} Nov 25 13:00:26 crc kubenswrapper[4693]: I1125 13:00:26.855547 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5vb8q" podStartSLOduration=2.264929185 podStartE2EDuration="5.855528065s" podCreationTimestamp="2025-11-25 13:00:21 +0000 UTC" firstStartedPulling="2025-11-25 13:00:22.77227276 +0000 UTC m=+3142.690358141" lastFinishedPulling="2025-11-25 13:00:26.36287164 +0000 UTC m=+3146.280957021" observedRunningTime="2025-11-25 13:00:26.848175555 +0000 UTC m=+3146.766260936" watchObservedRunningTime="2025-11-25 13:00:26.855528065 +0000 UTC m=+3146.773613446" Nov 25 13:00:28 crc 
kubenswrapper[4693]: I1125 13:00:28.905649 4693 scope.go:117] "RemoveContainer" containerID="095e216724e167ef0c237df79c4ffcb351ce969a7ff1ea062d45cf98bd36f7e4" Nov 25 13:00:29 crc kubenswrapper[4693]: I1125 13:00:29.813909 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:00:29 crc kubenswrapper[4693]: E1125 13:00:29.814409 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:00:31 crc kubenswrapper[4693]: I1125 13:00:31.546355 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:31 crc kubenswrapper[4693]: I1125 13:00:31.547462 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:32 crc kubenswrapper[4693]: I1125 13:00:32.616628 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5vb8q" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="registry-server" probeResult="failure" output=< Nov 25 13:00:32 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:00:32 crc kubenswrapper[4693]: > Nov 25 13:00:41 crc kubenswrapper[4693]: I1125 13:00:41.613486 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:41 crc kubenswrapper[4693]: I1125 13:00:41.675122 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:41 crc kubenswrapper[4693]: I1125 13:00:41.862444 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vb8q"] Nov 25 13:00:42 crc kubenswrapper[4693]: I1125 13:00:42.996019 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5vb8q" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="registry-server" containerID="cri-o://06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519" gracePeriod=2 Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.537336 4693 util.go:48] "No ready sandbox for pod can be found. 
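
[Annotation: the probe transitions above trace the catalog pod's startup. The registry-server endpoint on :50051 refuses connections one second in ("timeout: failed to connect service \":50051\" within 1s", which looks like grpc_health_probe output), so the startup probe reports unhealthy; by 13:00:41 it flips to started and the readiness probe to ready. A rough sketch of a connect-with-timeout check of this kind; a plain TCP dial for illustration only, since the real probe likely speaks the gRPC health protocol:]

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // probeOnce attempts a connection to addr within timeout and returns an
    // error shaped like the log output when the service is not up yet.
    func probeOnce(addr string, timeout time.Duration) error {
        conn, err := net.DialTimeout("tcp", addr, timeout)
        if err != nil {
            return fmt.Errorf("timeout: failed to connect service %q within %s", addr, timeout)
        }
        conn.Close()
        return nil
    }

    func main() {
        if err := probeOnce("127.0.0.1:50051", time.Second); err != nil {
            // What the kubelet records as probeResult="failure".
            fmt.Println("startup probe unhealthy:", err)
            return
        }
        fmt.Println("startup probe started")
    }
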
Need to start a new one" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.679700 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-catalog-content\") pod \"e771b3cc-5f13-42d5-979c-c075104676ce\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.679825 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mwck\" (UniqueName: \"kubernetes.io/projected/e771b3cc-5f13-42d5-979c-c075104676ce-kube-api-access-9mwck\") pod \"e771b3cc-5f13-42d5-979c-c075104676ce\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.679908 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-utilities\") pod \"e771b3cc-5f13-42d5-979c-c075104676ce\" (UID: \"e771b3cc-5f13-42d5-979c-c075104676ce\") " Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.680805 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-utilities" (OuterVolumeSpecName: "utilities") pod "e771b3cc-5f13-42d5-979c-c075104676ce" (UID: "e771b3cc-5f13-42d5-979c-c075104676ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.685410 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e771b3cc-5f13-42d5-979c-c075104676ce-kube-api-access-9mwck" (OuterVolumeSpecName: "kube-api-access-9mwck") pod "e771b3cc-5f13-42d5-979c-c075104676ce" (UID: "e771b3cc-5f13-42d5-979c-c075104676ce"). InnerVolumeSpecName "kube-api-access-9mwck". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.771799 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e771b3cc-5f13-42d5-979c-c075104676ce" (UID: "e771b3cc-5f13-42d5-979c-c075104676ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.782020 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.782053 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mwck\" (UniqueName: \"kubernetes.io/projected/e771b3cc-5f13-42d5-979c-c075104676ce-kube-api-access-9mwck\") on node \"crc\" DevicePath \"\"" Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.782067 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e771b3cc-5f13-42d5-979c-c075104676ce-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 13:00:43 crc kubenswrapper[4693]: I1125 13:00:43.812899 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:00:43 crc kubenswrapper[4693]: E1125 13:00:43.813144 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.005146 4693 generic.go:334] "Generic (PLEG): container finished" podID="e771b3cc-5f13-42d5-979c-c075104676ce" containerID="06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519" exitCode=0 Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.005207 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5vb8q" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.005199 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vb8q" event={"ID":"e771b3cc-5f13-42d5-979c-c075104676ce","Type":"ContainerDied","Data":"06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519"} Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.005338 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5vb8q" event={"ID":"e771b3cc-5f13-42d5-979c-c075104676ce","Type":"ContainerDied","Data":"a6684f742fe4bfef184a17882105ac91129656e8bdcc5467c885ddac9e749d2b"} Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.005363 4693 scope.go:117] "RemoveContainer" containerID="06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.044059 4693 scope.go:117] "RemoveContainer" containerID="a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.060834 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5vb8q"] Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.069652 4693 scope.go:117] "RemoveContainer" containerID="bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.079204 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5vb8q"] Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.142159 4693 scope.go:117] "RemoveContainer" containerID="06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519" Nov 25 13:00:44 crc kubenswrapper[4693]: E1125 13:00:44.142692 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519\": container with ID starting with 06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519 not found: ID does not exist" containerID="06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.142787 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519"} err="failed to get container status \"06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519\": rpc error: code = NotFound desc = could not find container \"06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519\": container with ID starting with 06ba5a0412532ea24bcde6996d2244552a33603be9fd75654a65a8025fe1a519 not found: ID does not exist" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.142844 4693 scope.go:117] "RemoveContainer" containerID="a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52" Nov 25 13:00:44 crc kubenswrapper[4693]: E1125 13:00:44.143463 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52\": container with ID starting with a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52 not found: ID does not exist" containerID="a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.143514 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52"} err="failed to get container status \"a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52\": rpc error: code = NotFound desc = could not find container \"a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52\": container with ID starting with a397a1852fa7fbe8d8b5ebf44ece9469db693832795b39476a67d7d403feef52 not found: ID does not exist" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.143544 4693 scope.go:117] "RemoveContainer" containerID="bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4" Nov 25 13:00:44 crc kubenswrapper[4693]: E1125 13:00:44.143892 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4\": container with ID starting with bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4 not found: ID does not exist" containerID="bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.143957 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4"} err="failed to get container status \"bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4\": rpc error: code = NotFound desc = could not find container \"bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4\": container with ID starting with bb8f62dff6eb80f5a7633f0a90601f55cde8dcc43318a4ab3ae267fd6bb62fd4 not found: ID does not exist" Nov 25 13:00:44 crc kubenswrapper[4693]: I1125 13:00:44.831221 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" path="/var/lib/kubelet/pods/e771b3cc-5f13-42d5-979c-c075104676ce/volumes" Nov 25 13:00:54 crc kubenswrapper[4693]: I1125 13:00:54.813646 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:00:54 crc kubenswrapper[4693]: E1125 13:00:54.814674 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.156950 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29401261-7sn2t"] Nov 25 13:01:00 crc kubenswrapper[4693]: E1125 13:01:00.157774 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="extract-content" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.157795 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="extract-content" Nov 25 13:01:00 crc kubenswrapper[4693]: E1125 13:01:00.157817 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="registry-server" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.157826 4693 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="registry-server" Nov 25 13:01:00 crc kubenswrapper[4693]: E1125 13:01:00.157878 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="extract-utilities" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.157888 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="extract-utilities" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.158146 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e771b3cc-5f13-42d5-979c-c075104676ce" containerName="registry-server" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.159025 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.189454 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29401261-7sn2t"] Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.222888 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-combined-ca-bundle\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.223291 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-fernet-keys\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.223343 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqn6g\" (UniqueName: \"kubernetes.io/projected/3306a30d-dcff-4460-81f2-3561573e57a2-kube-api-access-vqn6g\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.223391 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-config-data\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.325409 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-fernet-keys\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.325453 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqn6g\" (UniqueName: \"kubernetes.io/projected/3306a30d-dcff-4460-81f2-3561573e57a2-kube-api-access-vqn6g\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.325473 4693 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-config-data\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.325555 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-combined-ca-bundle\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.332174 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-combined-ca-bundle\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.343472 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqn6g\" (UniqueName: \"kubernetes.io/projected/3306a30d-dcff-4460-81f2-3561573e57a2-kube-api-access-vqn6g\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.344001 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-fernet-keys\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.352512 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-config-data\") pod \"keystone-cron-29401261-7sn2t\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.480076 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:00 crc kubenswrapper[4693]: I1125 13:01:00.943640 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29401261-7sn2t"] Nov 25 13:01:01 crc kubenswrapper[4693]: I1125 13:01:01.205417 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401261-7sn2t" event={"ID":"3306a30d-dcff-4460-81f2-3561573e57a2","Type":"ContainerStarted","Data":"cacb46728df86a01d49f69d7ab942f918c796c248a147bf907e7acac346d3c12"} Nov 25 13:01:01 crc kubenswrapper[4693]: I1125 13:01:01.205508 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401261-7sn2t" event={"ID":"3306a30d-dcff-4460-81f2-3561573e57a2","Type":"ContainerStarted","Data":"674e6ffc403a1f40289ac2df1a1aa61b0ef1884dabdd52a780bde3122c8bb3e9"} Nov 25 13:01:01 crc kubenswrapper[4693]: I1125 13:01:01.226534 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29401261-7sn2t" podStartSLOduration=1.226516409 podStartE2EDuration="1.226516409s" podCreationTimestamp="2025-11-25 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 13:01:01.224869005 +0000 UTC m=+3181.142954386" watchObservedRunningTime="2025-11-25 13:01:01.226516409 +0000 UTC m=+3181.144601780" Nov 25 13:01:03 crc kubenswrapper[4693]: I1125 13:01:03.225253 4693 generic.go:334] "Generic (PLEG): container finished" podID="3306a30d-dcff-4460-81f2-3561573e57a2" containerID="cacb46728df86a01d49f69d7ab942f918c796c248a147bf907e7acac346d3c12" exitCode=0 Nov 25 13:01:03 crc kubenswrapper[4693]: I1125 13:01:03.225427 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401261-7sn2t" event={"ID":"3306a30d-dcff-4460-81f2-3561573e57a2","Type":"ContainerDied","Data":"cacb46728df86a01d49f69d7ab942f918c796c248a147bf907e7acac346d3c12"} Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.655787 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.712581 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-config-data\") pod \"3306a30d-dcff-4460-81f2-3561573e57a2\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.712664 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-combined-ca-bundle\") pod \"3306a30d-dcff-4460-81f2-3561573e57a2\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.712732 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqn6g\" (UniqueName: \"kubernetes.io/projected/3306a30d-dcff-4460-81f2-3561573e57a2-kube-api-access-vqn6g\") pod \"3306a30d-dcff-4460-81f2-3561573e57a2\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.712826 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-fernet-keys\") pod \"3306a30d-dcff-4460-81f2-3561573e57a2\" (UID: \"3306a30d-dcff-4460-81f2-3561573e57a2\") " Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.718299 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3306a30d-dcff-4460-81f2-3561573e57a2-kube-api-access-vqn6g" (OuterVolumeSpecName: "kube-api-access-vqn6g") pod "3306a30d-dcff-4460-81f2-3561573e57a2" (UID: "3306a30d-dcff-4460-81f2-3561573e57a2"). InnerVolumeSpecName "kube-api-access-vqn6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.718326 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3306a30d-dcff-4460-81f2-3561573e57a2" (UID: "3306a30d-dcff-4460-81f2-3561573e57a2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.762447 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3306a30d-dcff-4460-81f2-3561573e57a2" (UID: "3306a30d-dcff-4460-81f2-3561573e57a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.781274 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-config-data" (OuterVolumeSpecName: "config-data") pod "3306a30d-dcff-4460-81f2-3561573e57a2" (UID: "3306a30d-dcff-4460-81f2-3561573e57a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.815763 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.815793 4693 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.815806 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqn6g\" (UniqueName: \"kubernetes.io/projected/3306a30d-dcff-4460-81f2-3561573e57a2-kube-api-access-vqn6g\") on node \"crc\" DevicePath \"\"" Nov 25 13:01:04 crc kubenswrapper[4693]: I1125 13:01:04.815818 4693 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3306a30d-dcff-4460-81f2-3561573e57a2-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 25 13:01:05 crc kubenswrapper[4693]: I1125 13:01:05.245108 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29401261-7sn2t" event={"ID":"3306a30d-dcff-4460-81f2-3561573e57a2","Type":"ContainerDied","Data":"674e6ffc403a1f40289ac2df1a1aa61b0ef1884dabdd52a780bde3122c8bb3e9"} Nov 25 13:01:05 crc kubenswrapper[4693]: I1125 13:01:05.245164 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674e6ffc403a1f40289ac2df1a1aa61b0ef1884dabdd52a780bde3122c8bb3e9" Nov 25 13:01:05 crc kubenswrapper[4693]: I1125 13:01:05.245183 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29401261-7sn2t" Nov 25 13:01:09 crc kubenswrapper[4693]: I1125 13:01:09.813422 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:01:09 crc kubenswrapper[4693]: E1125 13:01:09.814241 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:01:20 crc kubenswrapper[4693]: I1125 13:01:20.819280 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:01:20 crc kubenswrapper[4693]: E1125 13:01:20.821221 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:01:35 crc kubenswrapper[4693]: I1125 13:01:35.812736 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:01:35 crc kubenswrapper[4693]: E1125 13:01:35.813545 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:01:48 crc kubenswrapper[4693]: I1125 13:01:48.813579 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:01:48 crc kubenswrapper[4693]: E1125 13:01:48.814425 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:02:00 crc kubenswrapper[4693]: I1125 13:02:00.827546 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:02:00 crc kubenswrapper[4693]: E1125 13:02:00.828555 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:02:14 crc kubenswrapper[4693]: I1125 13:02:14.813038 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:02:14 crc kubenswrapper[4693]: E1125 13:02:14.813987 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:02:26 crc kubenswrapper[4693]: I1125 13:02:26.813593 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:02:26 crc kubenswrapper[4693]: E1125 13:02:26.814622 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:02:37 crc kubenswrapper[4693]: I1125 13:02:37.813433 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:02:38 crc kubenswrapper[4693]: I1125 13:02:38.182071 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"0788863ba245e40e8c7c4d3419b23c74da4cee4596a6c4576579fcfae24fd5a8"} Nov 25 13:05:05 crc 
kubenswrapper[4693]: I1125 13:05:05.113926 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:05:05 crc kubenswrapper[4693]: I1125 13:05:05.114483 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.722168 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4t7sd"] Nov 25 13:05:23 crc kubenswrapper[4693]: E1125 13:05:23.723154 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3306a30d-dcff-4460-81f2-3561573e57a2" containerName="keystone-cron" Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.723172 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="3306a30d-dcff-4460-81f2-3561573e57a2" containerName="keystone-cron" Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.723518 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="3306a30d-dcff-4460-81f2-3561573e57a2" containerName="keystone-cron" Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.725297 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.734049 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t7sd"] Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.900535 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-catalog-content\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.900613 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-utilities\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:23 crc kubenswrapper[4693]: I1125 13:05:23.900666 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8mlq\" (UniqueName: \"kubernetes.io/projected/caae5a26-497c-443f-8f89-e934009f811a-kube-api-access-v8mlq\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.002320 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-catalog-content\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.002443 
4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-utilities\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.002500 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8mlq\" (UniqueName: \"kubernetes.io/projected/caae5a26-497c-443f-8f89-e934009f811a-kube-api-access-v8mlq\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.003004 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-catalog-content\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.003028 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-utilities\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.031392 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8mlq\" (UniqueName: \"kubernetes.io/projected/caae5a26-497c-443f-8f89-e934009f811a-kube-api-access-v8mlq\") pod \"redhat-marketplace-4t7sd\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.046600 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:24 crc kubenswrapper[4693]: W1125 13:05:24.699518 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaae5a26_497c_443f_8f89_e934009f811a.slice/crio-af7c57fa90af9a3dfddb2f8490c94abaeda3d334c4396f444e90c7260b5ab1e7 WatchSource:0}: Error finding container af7c57fa90af9a3dfddb2f8490c94abaeda3d334c4396f444e90c7260b5ab1e7: Status 404 returned error can't find the container with id af7c57fa90af9a3dfddb2f8490c94abaeda3d334c4396f444e90c7260b5ab1e7 Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.709324 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t7sd"] Nov 25 13:05:24 crc kubenswrapper[4693]: I1125 13:05:24.880157 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t7sd" event={"ID":"caae5a26-497c-443f-8f89-e934009f811a","Type":"ContainerStarted","Data":"af7c57fa90af9a3dfddb2f8490c94abaeda3d334c4396f444e90c7260b5ab1e7"} Nov 25 13:05:25 crc kubenswrapper[4693]: I1125 13:05:25.891423 4693 generic.go:334] "Generic (PLEG): container finished" podID="caae5a26-497c-443f-8f89-e934009f811a" containerID="74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243" exitCode=0 Nov 25 13:05:25 crc kubenswrapper[4693]: I1125 13:05:25.892528 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t7sd" event={"ID":"caae5a26-497c-443f-8f89-e934009f811a","Type":"ContainerDied","Data":"74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243"} Nov 25 13:05:25 crc kubenswrapper[4693]: I1125 13:05:25.895409 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 13:05:27 crc kubenswrapper[4693]: I1125 13:05:27.913193 4693 generic.go:334] "Generic (PLEG): container finished" podID="caae5a26-497c-443f-8f89-e934009f811a" containerID="2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020" exitCode=0 Nov 25 13:05:27 crc kubenswrapper[4693]: I1125 13:05:27.913286 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t7sd" event={"ID":"caae5a26-497c-443f-8f89-e934009f811a","Type":"ContainerDied","Data":"2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020"} Nov 25 13:05:28 crc kubenswrapper[4693]: I1125 13:05:28.928215 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t7sd" event={"ID":"caae5a26-497c-443f-8f89-e934009f811a","Type":"ContainerStarted","Data":"7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf"} Nov 25 13:05:28 crc kubenswrapper[4693]: I1125 13:05:28.949711 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4t7sd" podStartSLOduration=3.216865249 podStartE2EDuration="5.949689214s" podCreationTimestamp="2025-11-25 13:05:23 +0000 UTC" firstStartedPulling="2025-11-25 13:05:25.895025521 +0000 UTC m=+3445.813110912" lastFinishedPulling="2025-11-25 13:05:28.627849496 +0000 UTC m=+3448.545934877" observedRunningTime="2025-11-25 13:05:28.946674122 +0000 UTC m=+3448.864759503" watchObservedRunningTime="2025-11-25 13:05:28.949689214 +0000 UTC m=+3448.867774595" Nov 25 13:05:34 crc kubenswrapper[4693]: I1125 13:05:34.047611 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:34 crc kubenswrapper[4693]: I1125 13:05:34.049107 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:34 crc kubenswrapper[4693]: I1125 13:05:34.107999 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:35 crc kubenswrapper[4693]: I1125 13:05:35.045311 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:35 crc kubenswrapper[4693]: I1125 13:05:35.113293 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:05:35 crc kubenswrapper[4693]: I1125 13:05:35.113360 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:05:36 crc kubenswrapper[4693]: I1125 13:05:36.118680 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t7sd"] Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.029884 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4t7sd" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="registry-server" containerID="cri-o://7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf" gracePeriod=2 Nov 25 13:05:38 crc kubenswrapper[4693]: E1125 13:05:38.185108 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaae5a26_497c_443f_8f89_e934009f811a.slice/crio-7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf.scope\": RecentStats: unable to find data in memory cache]" Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.619885 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.709637 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-utilities\") pod \"caae5a26-497c-443f-8f89-e934009f811a\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.709939 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-catalog-content\") pod \"caae5a26-497c-443f-8f89-e934009f811a\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.710050 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8mlq\" (UniqueName: \"kubernetes.io/projected/caae5a26-497c-443f-8f89-e934009f811a-kube-api-access-v8mlq\") pod \"caae5a26-497c-443f-8f89-e934009f811a\" (UID: \"caae5a26-497c-443f-8f89-e934009f811a\") " Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.710939 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-utilities" (OuterVolumeSpecName: "utilities") pod "caae5a26-497c-443f-8f89-e934009f811a" (UID: "caae5a26-497c-443f-8f89-e934009f811a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.726674 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caae5a26-497c-443f-8f89-e934009f811a-kube-api-access-v8mlq" (OuterVolumeSpecName: "kube-api-access-v8mlq") pod "caae5a26-497c-443f-8f89-e934009f811a" (UID: "caae5a26-497c-443f-8f89-e934009f811a"). InnerVolumeSpecName "kube-api-access-v8mlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.731427 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caae5a26-497c-443f-8f89-e934009f811a" (UID: "caae5a26-497c-443f-8f89-e934009f811a"). InnerVolumeSpecName "catalog-content". 
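
The DELETE above kicks off the usual teardown order: the registry-server container is killed with gracePeriod=2, the reconciler then starts UnmountVolume for each of the pod's volumes, and a volume is reported detached only once its TearDown has succeeded. A toy model of that ordering (an assumed simplification; per-plugin details are omitted):

```go
package main

import (
	"fmt"
	"sync"
)

// tearDown stands in for the per-plugin TearDown call (empty-dir,
// projected, ...); hypothetical, for illustration only.
func tearDown(volume string) error {
	fmt.Println("UnmountVolume.TearDown succeeded for", volume)
	return nil
}

// unmountAll mirrors the ordering visible in the log: every volume of a
// terminated pod is torn down first, and "Volume detached" is reported
// only for volumes whose TearDown succeeded.
func unmountAll(volumes []string) {
	var wg sync.WaitGroup
	detached := make(chan string, len(volumes))
	for _, v := range volumes {
		wg.Add(1)
		go func(v string) {
			defer wg.Done()
			if err := tearDown(v); err == nil {
				detached <- v
			}
		}(v)
	}
	wg.Wait()
	close(detached)
	for v := range detached {
		fmt.Printf("Volume detached for volume %q on node \"crc\"\n", v)
	}
}

func main() {
	unmountAll([]string{"utilities", "catalog-content", "kube-api-access-v8mlq"})
}
```
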
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.812644 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8mlq\" (UniqueName: \"kubernetes.io/projected/caae5a26-497c-443f-8f89-e934009f811a-kube-api-access-v8mlq\") on node \"crc\" DevicePath \"\"" Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.812700 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 13:05:38 crc kubenswrapper[4693]: I1125 13:05:38.812714 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caae5a26-497c-443f-8f89-e934009f811a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.041218 4693 generic.go:334] "Generic (PLEG): container finished" podID="caae5a26-497c-443f-8f89-e934009f811a" containerID="7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf" exitCode=0 Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.041271 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t7sd" event={"ID":"caae5a26-497c-443f-8f89-e934009f811a","Type":"ContainerDied","Data":"7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf"} Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.041300 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t7sd" event={"ID":"caae5a26-497c-443f-8f89-e934009f811a","Type":"ContainerDied","Data":"af7c57fa90af9a3dfddb2f8490c94abaeda3d334c4396f444e90c7260b5ab1e7"} Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.041318 4693 scope.go:117] "RemoveContainer" containerID="7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.041272 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t7sd" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.073480 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t7sd"] Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.076765 4693 scope.go:117] "RemoveContainer" containerID="2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.114007 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t7sd"] Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.116757 4693 scope.go:117] "RemoveContainer" containerID="74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.168659 4693 scope.go:117] "RemoveContainer" containerID="7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf" Nov 25 13:05:39 crc kubenswrapper[4693]: E1125 13:05:39.169159 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf\": container with ID starting with 7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf not found: ID does not exist" containerID="7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.169200 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf"} err="failed to get container status \"7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf\": rpc error: code = NotFound desc = could not find container \"7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf\": container with ID starting with 7409cb1e8594ba95737bc24f969ba4c3b589943da2d154152187ccc52c00c4bf not found: ID does not exist" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.169228 4693 scope.go:117] "RemoveContainer" containerID="2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020" Nov 25 13:05:39 crc kubenswrapper[4693]: E1125 13:05:39.169565 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020\": container with ID starting with 2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020 not found: ID does not exist" containerID="2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.169585 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020"} err="failed to get container status \"2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020\": rpc error: code = NotFound desc = could not find container \"2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020\": container with ID starting with 2cfd268aba2988bbd3d2ed4e41dd259173d309f48167b6994b2baa719de08020 not found: ID does not exist" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.169598 4693 scope.go:117] "RemoveContainer" containerID="74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243" Nov 25 13:05:39 crc kubenswrapper[4693]: E1125 13:05:39.169893 4693 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243\": container with ID starting with 74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243 not found: ID does not exist" containerID="74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243" Nov 25 13:05:39 crc kubenswrapper[4693]: I1125 13:05:39.169953 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243"} err="failed to get container status \"74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243\": rpc error: code = NotFound desc = could not find container \"74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243\": container with ID starting with 74d548b7f37e1add583cc207986d18bee01579012d949fc33843bd226aa6c243 not found: ID does not exist" Nov 25 13:05:40 crc kubenswrapper[4693]: I1125 13:05:40.827213 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caae5a26-497c-443f-8f89-e934009f811a" path="/var/lib/kubelet/pods/caae5a26-497c-443f-8f89-e934009f811a/volumes" Nov 25 13:06:05 crc kubenswrapper[4693]: I1125 13:06:05.113762 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:06:05 crc kubenswrapper[4693]: I1125 13:06:05.114287 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:06:05 crc kubenswrapper[4693]: I1125 13:06:05.114331 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 13:06:05 crc kubenswrapper[4693]: I1125 13:06:05.289470 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0788863ba245e40e8c7c4d3419b23c74da4cee4596a6c4576579fcfae24fd5a8"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 13:06:05 crc kubenswrapper[4693]: I1125 13:06:05.289536 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://0788863ba245e40e8c7c4d3419b23c74da4cee4596a6c4576579fcfae24fd5a8" gracePeriod=600 Nov 25 13:06:06 crc kubenswrapper[4693]: I1125 13:06:06.300474 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="0788863ba245e40e8c7c4d3419b23c74da4cee4596a6c4576579fcfae24fd5a8" exitCode=0 Nov 25 13:06:06 crc kubenswrapper[4693]: I1125 13:06:06.300520 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" 
event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"0788863ba245e40e8c7c4d3419b23c74da4cee4596a6c4576579fcfae24fd5a8"} Nov 25 13:06:06 crc kubenswrapper[4693]: I1125 13:06:06.301007 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535"} Nov 25 13:06:06 crc kubenswrapper[4693]: I1125 13:06:06.301028 4693 scope.go:117] "RemoveContainer" containerID="3ec58681e03decad6443567d74856dc49153220ed678bbedd247e87de3b54fae" Nov 25 13:08:05 crc kubenswrapper[4693]: I1125 13:08:05.113535 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:08:05 crc kubenswrapper[4693]: I1125 13:08:05.115157 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.270692 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d6lzv"] Nov 25 13:08:23 crc kubenswrapper[4693]: E1125 13:08:23.271728 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="extract-content" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.271744 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="extract-content" Nov 25 13:08:23 crc kubenswrapper[4693]: E1125 13:08:23.271762 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="registry-server" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.271769 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="registry-server" Nov 25 13:08:23 crc kubenswrapper[4693]: E1125 13:08:23.271810 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="extract-utilities" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.271818 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="extract-utilities" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.272012 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="caae5a26-497c-443f-8f89-e934009f811a" containerName="registry-server" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.273730 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.291305 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d6lzv"] Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.421606 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkkj\" (UniqueName: \"kubernetes.io/projected/c581bf0a-3326-4d07-8162-f4a9eb5115ad-kube-api-access-4bkkj\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.421670 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-utilities\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.421692 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-catalog-content\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.523306 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkkj\" (UniqueName: \"kubernetes.io/projected/c581bf0a-3326-4d07-8162-f4a9eb5115ad-kube-api-access-4bkkj\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.523416 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-utilities\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.523459 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-catalog-content\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.524021 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-utilities\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.524107 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-catalog-content\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.550776 4693 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4bkkj\" (UniqueName: \"kubernetes.io/projected/c581bf0a-3326-4d07-8162-f4a9eb5115ad-kube-api-access-4bkkj\") pod \"certified-operators-d6lzv\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:23 crc kubenswrapper[4693]: I1125 13:08:23.593977 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:24 crc kubenswrapper[4693]: I1125 13:08:24.089619 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d6lzv"] Nov 25 13:08:24 crc kubenswrapper[4693]: I1125 13:08:24.696839 4693 generic.go:334] "Generic (PLEG): container finished" podID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerID="28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e" exitCode=0 Nov 25 13:08:24 crc kubenswrapper[4693]: I1125 13:08:24.696976 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6lzv" event={"ID":"c581bf0a-3326-4d07-8162-f4a9eb5115ad","Type":"ContainerDied","Data":"28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e"} Nov 25 13:08:24 crc kubenswrapper[4693]: I1125 13:08:24.697291 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6lzv" event={"ID":"c581bf0a-3326-4d07-8162-f4a9eb5115ad","Type":"ContainerStarted","Data":"2b89b06cd8b0b34b5f60206edc4fd868c4b88303c3f921e09ff6b226a4844b5c"} Nov 25 13:08:25 crc kubenswrapper[4693]: I1125 13:08:25.731021 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6lzv" event={"ID":"c581bf0a-3326-4d07-8162-f4a9eb5115ad","Type":"ContainerStarted","Data":"a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270"} Nov 25 13:08:26 crc kubenswrapper[4693]: I1125 13:08:26.745285 4693 generic.go:334] "Generic (PLEG): container finished" podID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerID="a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270" exitCode=0 Nov 25 13:08:26 crc kubenswrapper[4693]: I1125 13:08:26.745342 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6lzv" event={"ID":"c581bf0a-3326-4d07-8162-f4a9eb5115ad","Type":"ContainerDied","Data":"a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270"} Nov 25 13:08:28 crc kubenswrapper[4693]: I1125 13:08:28.770978 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6lzv" event={"ID":"c581bf0a-3326-4d07-8162-f4a9eb5115ad","Type":"ContainerStarted","Data":"1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88"} Nov 25 13:08:28 crc kubenswrapper[4693]: I1125 13:08:28.792222 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d6lzv" podStartSLOduration=2.683525375 podStartE2EDuration="5.79219878s" podCreationTimestamp="2025-11-25 13:08:23 +0000 UTC" firstStartedPulling="2025-11-25 13:08:24.699659106 +0000 UTC m=+3624.617744487" lastFinishedPulling="2025-11-25 13:08:27.808332471 +0000 UTC m=+3627.726417892" observedRunningTime="2025-11-25 13:08:28.789119337 +0000 UTC m=+3628.707204718" watchObservedRunningTime="2025-11-25 13:08:28.79219878 +0000 UTC m=+3628.710284181" Nov 25 13:08:33 crc kubenswrapper[4693]: I1125 13:08:33.594763 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:33 crc kubenswrapper[4693]: I1125 13:08:33.595349 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:33 crc kubenswrapper[4693]: I1125 13:08:33.660048 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:33 crc kubenswrapper[4693]: I1125 13:08:33.856447 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:33 crc kubenswrapper[4693]: I1125 13:08:33.917281 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d6lzv"] Nov 25 13:08:35 crc kubenswrapper[4693]: I1125 13:08:35.114064 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:08:35 crc kubenswrapper[4693]: I1125 13:08:35.114137 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:08:35 crc kubenswrapper[4693]: I1125 13:08:35.831429 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d6lzv" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="registry-server" containerID="cri-o://1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88" gracePeriod=2 Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.401537 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.572742 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-catalog-content\") pod \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.572799 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bkkj\" (UniqueName: \"kubernetes.io/projected/c581bf0a-3326-4d07-8162-f4a9eb5115ad-kube-api-access-4bkkj\") pod \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.573050 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-utilities\") pod \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\" (UID: \"c581bf0a-3326-4d07-8162-f4a9eb5115ad\") " Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.574235 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-utilities" (OuterVolumeSpecName: "utilities") pod "c581bf0a-3326-4d07-8162-f4a9eb5115ad" (UID: "c581bf0a-3326-4d07-8162-f4a9eb5115ad"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.579320 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c581bf0a-3326-4d07-8162-f4a9eb5115ad-kube-api-access-4bkkj" (OuterVolumeSpecName: "kube-api-access-4bkkj") pod "c581bf0a-3326-4d07-8162-f4a9eb5115ad" (UID: "c581bf0a-3326-4d07-8162-f4a9eb5115ad"). InnerVolumeSpecName "kube-api-access-4bkkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.615085 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c581bf0a-3326-4d07-8162-f4a9eb5115ad" (UID: "c581bf0a-3326-4d07-8162-f4a9eb5115ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.676799 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.676864 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c581bf0a-3326-4d07-8162-f4a9eb5115ad-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.676899 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bkkj\" (UniqueName: \"kubernetes.io/projected/c581bf0a-3326-4d07-8162-f4a9eb5115ad-kube-api-access-4bkkj\") on node \"crc\" DevicePath \"\"" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.843875 4693 generic.go:334] "Generic (PLEG): container finished" podID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerID="1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88" exitCode=0 Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.843944 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d6lzv" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.843953 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6lzv" event={"ID":"c581bf0a-3326-4d07-8162-f4a9eb5115ad","Type":"ContainerDied","Data":"1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88"} Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.844040 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d6lzv" event={"ID":"c581bf0a-3326-4d07-8162-f4a9eb5115ad","Type":"ContainerDied","Data":"2b89b06cd8b0b34b5f60206edc4fd868c4b88303c3f921e09ff6b226a4844b5c"} Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.844090 4693 scope.go:117] "RemoveContainer" containerID="1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.884157 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d6lzv"] Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.890581 4693 scope.go:117] "RemoveContainer" containerID="a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.896715 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d6lzv"] Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.914879 4693 scope.go:117] "RemoveContainer" containerID="28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.970047 4693 scope.go:117] "RemoveContainer" containerID="1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88" Nov 25 13:08:36 crc kubenswrapper[4693]: E1125 13:08:36.970456 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88\": container with ID starting with 1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88 not found: ID does not exist" containerID="1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.970529 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88"} err="failed to get container status \"1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88\": rpc error: code = NotFound desc = could not find container \"1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88\": container with ID starting with 1d59577aa8a0fb254d599de7fe329708815be534194fcc1f6f63826aeb4dac88 not found: ID does not exist" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.970576 4693 scope.go:117] "RemoveContainer" containerID="a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270" Nov 25 13:08:36 crc kubenswrapper[4693]: E1125 13:08:36.970885 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270\": container with ID starting with a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270 not found: ID does not exist" containerID="a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.970940 4693 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270"} err="failed to get container status \"a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270\": rpc error: code = NotFound desc = could not find container \"a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270\": container with ID starting with a60797affc5d26061043144a083f07ca6ceb765b551909884708b2c784c4a270 not found: ID does not exist" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.970973 4693 scope.go:117] "RemoveContainer" containerID="28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e" Nov 25 13:08:36 crc kubenswrapper[4693]: E1125 13:08:36.971227 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e\": container with ID starting with 28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e not found: ID does not exist" containerID="28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e" Nov 25 13:08:36 crc kubenswrapper[4693]: I1125 13:08:36.971257 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e"} err="failed to get container status \"28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e\": rpc error: code = NotFound desc = could not find container \"28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e\": container with ID starting with 28990e9b2de7de07313c1af2f786feb379fb103845a4f3fb46fd03b19c791c3e not found: ID does not exist" Nov 25 13:08:38 crc kubenswrapper[4693]: I1125 13:08:38.828283 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" path="/var/lib/kubelet/pods/c581bf0a-3326-4d07-8162-f4a9eb5115ad/volumes" Nov 25 13:09:05 crc kubenswrapper[4693]: I1125 13:09:05.113935 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:09:05 crc kubenswrapper[4693]: I1125 13:09:05.114460 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:09:05 crc kubenswrapper[4693]: I1125 13:09:05.114519 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 13:09:05 crc kubenswrapper[4693]: I1125 13:09:05.115400 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 13:09:05 crc kubenswrapper[4693]: I1125 13:09:05.115458 4693 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" gracePeriod=600 Nov 25 13:09:05 crc kubenswrapper[4693]: E1125 13:09:05.240136 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:09:06 crc kubenswrapper[4693]: I1125 13:09:06.136793 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" exitCode=0 Nov 25 13:09:06 crc kubenswrapper[4693]: I1125 13:09:06.136857 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535"} Nov 25 13:09:06 crc kubenswrapper[4693]: I1125 13:09:06.136899 4693 scope.go:117] "RemoveContainer" containerID="0788863ba245e40e8c7c4d3419b23c74da4cee4596a6c4576579fcfae24fd5a8" Nov 25 13:09:06 crc kubenswrapper[4693]: I1125 13:09:06.137747 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:09:06 crc kubenswrapper[4693]: E1125 13:09:06.138211 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:09:17 crc kubenswrapper[4693]: I1125 13:09:17.813290 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:09:17 crc kubenswrapper[4693]: E1125 13:09:17.814142 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:09:32 crc kubenswrapper[4693]: I1125 13:09:32.812724 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:09:32 crc kubenswrapper[4693]: E1125 13:09:32.813523 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:09:46 crc 
kubenswrapper[4693]: I1125 13:09:46.813298 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:09:46 crc kubenswrapper[4693]: E1125 13:09:46.814120 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:10:01 crc kubenswrapper[4693]: I1125 13:10:01.813695 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:10:01 crc kubenswrapper[4693]: E1125 13:10:01.815546 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:10:12 crc kubenswrapper[4693]: I1125 13:10:12.813110 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:10:12 crc kubenswrapper[4693]: E1125 13:10:12.815085 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:10:25 crc kubenswrapper[4693]: I1125 13:10:25.813145 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:10:25 crc kubenswrapper[4693]: E1125 13:10:25.814184 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:10:38 crc kubenswrapper[4693]: I1125 13:10:38.814808 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:10:38 crc kubenswrapper[4693]: E1125 13:10:38.815751 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:10:41 crc kubenswrapper[4693]: I1125 13:10:41.029846 4693 generic.go:334] "Generic (PLEG): container finished" podID="9b66ebc6-a0f0-4418-8d28-7364b1f5d177" 
containerID="42d44e4c360fd562bc80663098d25332a1f176691a4a88b1f0ffd0871fe51fd9" exitCode=0 Nov 25 13:10:41 crc kubenswrapper[4693]: I1125 13:10:41.029952 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b66ebc6-a0f0-4418-8d28-7364b1f5d177","Type":"ContainerDied","Data":"42d44e4c360fd562bc80663098d25332a1f176691a4a88b1f0ffd0871fe51fd9"} Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.487497 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.643826 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-config-data\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644118 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxcfl\" (UniqueName: \"kubernetes.io/projected/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-kube-api-access-sxcfl\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644175 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ssh-key\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644215 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ca-certs\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644296 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644315 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-workdir\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644366 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644437 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-temporary\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644459 4693 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config-secret\") pod \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\" (UID: \"9b66ebc6-a0f0-4418-8d28-7364b1f5d177\") " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.644982 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-config-data" (OuterVolumeSpecName: "config-data") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.645971 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.651535 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-kube-api-access-sxcfl" (OuterVolumeSpecName: "kube-api-access-sxcfl") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "kube-api-access-sxcfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.655599 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.677467 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.679136 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.682082 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.693199 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.701997 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9b66ebc6-a0f0-4418-8d28-7364b1f5d177" (UID: "9b66ebc6-a0f0-4418-8d28-7364b1f5d177"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746500 4693 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746546 4693 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746558 4693 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746568 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746578 4693 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746588 4693 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746597 4693 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-config-data\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746605 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxcfl\" (UniqueName: \"kubernetes.io/projected/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-kube-api-access-sxcfl\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.746613 4693 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b66ebc6-a0f0-4418-8d28-7364b1f5d177-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.772757 4693 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 25 13:10:42 crc kubenswrapper[4693]: I1125 13:10:42.848003 4693 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 25 13:10:43 crc kubenswrapper[4693]: I1125 13:10:43.050854 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9b66ebc6-a0f0-4418-8d28-7364b1f5d177","Type":"ContainerDied","Data":"e4791e8e84ec354735e0ece1b0e43d9cfa2854f79497283aa031b7ca7e35b67f"} Nov 25 13:10:43 crc kubenswrapper[4693]: I1125 13:10:43.050904 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4791e8e84ec354735e0ece1b0e43d9cfa2854f79497283aa031b7ca7e35b67f" Nov 25 13:10:43 crc kubenswrapper[4693]: I1125 13:10:43.051024 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 25 13:10:50 crc kubenswrapper[4693]: I1125 13:10:50.824132 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:10:50 crc kubenswrapper[4693]: E1125 13:10:50.825134 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.010856 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 13:10:53 crc kubenswrapper[4693]: E1125 13:10:53.012421 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="extract-content" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.012445 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="extract-content" Nov 25 13:10:53 crc kubenswrapper[4693]: E1125 13:10:53.012464 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b66ebc6-a0f0-4418-8d28-7364b1f5d177" containerName="tempest-tests-tempest-tests-runner" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.012473 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b66ebc6-a0f0-4418-8d28-7364b1f5d177" containerName="tempest-tests-tempest-tests-runner" Nov 25 13:10:53 crc kubenswrapper[4693]: E1125 13:10:53.012492 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="registry-server" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.012498 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="registry-server" Nov 25 13:10:53 crc kubenswrapper[4693]: E1125 13:10:53.012514 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="extract-utilities" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.012520 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="extract-utilities" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 
13:10:53.012741 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="c581bf0a-3326-4d07-8162-f4a9eb5115ad" containerName="registry-server" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.012752 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b66ebc6-a0f0-4418-8d28-7364b1f5d177" containerName="tempest-tests-tempest-tests-runner" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.013679 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.017600 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qt8mw" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.024833 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.147627 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzc98\" (UniqueName: \"kubernetes.io/projected/54397ebd-dc91-441f-9c68-261a1c952589-kube-api-access-rzc98\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54397ebd-dc91-441f-9c68-261a1c952589\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.147679 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54397ebd-dc91-441f-9c68-261a1c952589\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.249432 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzc98\" (UniqueName: \"kubernetes.io/projected/54397ebd-dc91-441f-9c68-261a1c952589-kube-api-access-rzc98\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54397ebd-dc91-441f-9c68-261a1c952589\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.249489 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54397ebd-dc91-441f-9c68-261a1c952589\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.250203 4693 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54397ebd-dc91-441f-9c68-261a1c952589\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.270604 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzc98\" (UniqueName: \"kubernetes.io/projected/54397ebd-dc91-441f-9c68-261a1c952589-kube-api-access-rzc98\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54397ebd-dc91-441f-9c68-261a1c952589\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.312958 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"54397ebd-dc91-441f-9c68-261a1c952589\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.343872 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.833694 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 25 13:10:53 crc kubenswrapper[4693]: I1125 13:10:53.846646 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 13:10:54 crc kubenswrapper[4693]: I1125 13:10:54.152838 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"54397ebd-dc91-441f-9c68-261a1c952589","Type":"ContainerStarted","Data":"c90097da74d1b9b6b6e632ba03a27c2c3c587b8eb97c9dddf1e526b90ee8d8fc"} Nov 25 13:10:56 crc kubenswrapper[4693]: I1125 13:10:56.177768 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"54397ebd-dc91-441f-9c68-261a1c952589","Type":"ContainerStarted","Data":"c6df19d2cee7a6419c9a4b8025c7a15512ef14c49a195717173a97229224c95b"} Nov 25 13:10:56 crc kubenswrapper[4693]: I1125 13:10:56.201232 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.0978324329999998 podStartE2EDuration="4.201211236s" podCreationTimestamp="2025-11-25 13:10:52 +0000 UTC" firstStartedPulling="2025-11-25 13:10:53.846246783 +0000 UTC m=+3773.764332194" lastFinishedPulling="2025-11-25 13:10:54.949625596 +0000 UTC m=+3774.867710997" observedRunningTime="2025-11-25 13:10:56.190187477 +0000 UTC m=+3776.108272858" watchObservedRunningTime="2025-11-25 13:10:56.201211236 +0000 UTC m=+3776.119296627" Nov 25 13:11:01 crc kubenswrapper[4693]: I1125 13:11:01.813796 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:11:01 crc kubenswrapper[4693]: E1125 13:11:01.814573 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:11:12 crc kubenswrapper[4693]: I1125 13:11:12.812962 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:11:12 crc kubenswrapper[4693]: E1125 13:11:12.813812 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.026476 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rt9rp"] Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.029134 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.037636 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rt9rp"] Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.162877 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a31b362f-c747-4bf0-bcce-27a2761b95e6-utilities\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.163190 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9zx\" (UniqueName: \"kubernetes.io/projected/a31b362f-c747-4bf0-bcce-27a2761b95e6-kube-api-access-rr9zx\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.163233 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a31b362f-c747-4bf0-bcce-27a2761b95e6-catalog-content\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.265234 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9zx\" (UniqueName: \"kubernetes.io/projected/a31b362f-c747-4bf0-bcce-27a2761b95e6-kube-api-access-rr9zx\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.265287 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a31b362f-c747-4bf0-bcce-27a2761b95e6-catalog-content\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.265440 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a31b362f-c747-4bf0-bcce-27a2761b95e6-utilities\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.265948 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a31b362f-c747-4bf0-bcce-27a2761b95e6-utilities\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc 
kubenswrapper[4693]: I1125 13:11:13.266136 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a31b362f-c747-4bf0-bcce-27a2761b95e6-catalog-content\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.290301 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9zx\" (UniqueName: \"kubernetes.io/projected/a31b362f-c747-4bf0-bcce-27a2761b95e6-kube-api-access-rr9zx\") pod \"redhat-operators-rt9rp\" (UID: \"a31b362f-c747-4bf0-bcce-27a2761b95e6\") " pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.366966 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:13 crc kubenswrapper[4693]: I1125 13:11:13.884725 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rt9rp"] Nov 25 13:11:14 crc kubenswrapper[4693]: I1125 13:11:14.365911 4693 generic.go:334] "Generic (PLEG): container finished" podID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerID="e1c4dc559fa1c85c432e9ede98c484c6205ff84a41c41c6a5123d8bec4fde1b7" exitCode=0 Nov 25 13:11:14 crc kubenswrapper[4693]: I1125 13:11:14.365950 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rt9rp" event={"ID":"a31b362f-c747-4bf0-bcce-27a2761b95e6","Type":"ContainerDied","Data":"e1c4dc559fa1c85c432e9ede98c484c6205ff84a41c41c6a5123d8bec4fde1b7"} Nov 25 13:11:14 crc kubenswrapper[4693]: I1125 13:11:14.366171 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rt9rp" event={"ID":"a31b362f-c747-4bf0-bcce-27a2761b95e6","Type":"ContainerStarted","Data":"df3552c8e49122843b4002b8d862a7f0f63c887ec14e525c3c1b21f2cb9782e0"} Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.764173 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8svzn/must-gather-zvnp5"] Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.766760 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.769971 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8svzn"/"openshift-service-ca.crt" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.770110 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8svzn"/"default-dockercfg-d2th6" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.770272 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8svzn"/"kube-root-ca.crt" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.778341 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8svzn/must-gather-zvnp5"] Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.801838 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbxm\" (UniqueName: \"kubernetes.io/projected/dc8f0717-6d45-4672-b4c6-3fb3011972eb-kube-api-access-fzbxm\") pod \"must-gather-zvnp5\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") " pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.801880 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc8f0717-6d45-4672-b4c6-3fb3011972eb-must-gather-output\") pod \"must-gather-zvnp5\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") " pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.903706 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbxm\" (UniqueName: \"kubernetes.io/projected/dc8f0717-6d45-4672-b4c6-3fb3011972eb-kube-api-access-fzbxm\") pod \"must-gather-zvnp5\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") " pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.903772 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc8f0717-6d45-4672-b4c6-3fb3011972eb-must-gather-output\") pod \"must-gather-zvnp5\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") " pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.904384 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc8f0717-6d45-4672-b4c6-3fb3011972eb-must-gather-output\") pod \"must-gather-zvnp5\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") " pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:18 crc kubenswrapper[4693]: I1125 13:11:18.931290 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbxm\" (UniqueName: \"kubernetes.io/projected/dc8f0717-6d45-4672-b4c6-3fb3011972eb-kube-api-access-fzbxm\") pod \"must-gather-zvnp5\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") " pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:19 crc kubenswrapper[4693]: I1125 13:11:19.087732 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8svzn/must-gather-zvnp5" Nov 25 13:11:24 crc kubenswrapper[4693]: I1125 13:11:24.812854 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:11:24 crc kubenswrapper[4693]: E1125 13:11:24.813573 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:11:26 crc kubenswrapper[4693]: I1125 13:11:26.251032 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8svzn/must-gather-zvnp5"] Nov 25 13:11:26 crc kubenswrapper[4693]: W1125 13:11:26.257840 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc8f0717_6d45_4672_b4c6_3fb3011972eb.slice/crio-c2a0cc359ce88dab78cf77cbb057c64ae078559c46496a3be2b7e3db24397d39 WatchSource:0}: Error finding container c2a0cc359ce88dab78cf77cbb057c64ae078559c46496a3be2b7e3db24397d39: Status 404 returned error can't find the container with id c2a0cc359ce88dab78cf77cbb057c64ae078559c46496a3be2b7e3db24397d39 Nov 25 13:11:26 crc kubenswrapper[4693]: I1125 13:11:26.486548 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/must-gather-zvnp5" event={"ID":"dc8f0717-6d45-4672-b4c6-3fb3011972eb","Type":"ContainerStarted","Data":"c2a0cc359ce88dab78cf77cbb057c64ae078559c46496a3be2b7e3db24397d39"} Nov 25 13:11:27 crc kubenswrapper[4693]: I1125 13:11:27.497193 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rt9rp" event={"ID":"a31b362f-c747-4bf0-bcce-27a2761b95e6","Type":"ContainerStarted","Data":"ced000cdf88a43fc39ada949737e56c3a5624fdb166010f74393bee997438cb9"} Nov 25 13:11:28 crc kubenswrapper[4693]: I1125 13:11:28.509599 4693 generic.go:334] "Generic (PLEG): container finished" podID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerID="ced000cdf88a43fc39ada949737e56c3a5624fdb166010f74393bee997438cb9" exitCode=0 Nov 25 13:11:28 crc kubenswrapper[4693]: I1125 13:11:28.509683 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rt9rp" event={"ID":"a31b362f-c747-4bf0-bcce-27a2761b95e6","Type":"ContainerDied","Data":"ced000cdf88a43fc39ada949737e56c3a5624fdb166010f74393bee997438cb9"} Nov 25 13:11:35 crc kubenswrapper[4693]: I1125 13:11:35.814181 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:11:35 crc kubenswrapper[4693]: E1125 13:11:35.815579 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:11:37 crc kubenswrapper[4693]: I1125 13:11:37.619050 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rt9rp" 
event={"ID":"a31b362f-c747-4bf0-bcce-27a2761b95e6","Type":"ContainerStarted","Data":"c761c36bf8f092a8aef0c0f1d79191d1a8e23c957a2a2d04240ef1e98ce844c6"} Nov 25 13:11:37 crc kubenswrapper[4693]: I1125 13:11:37.667458 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rt9rp" podStartSLOduration=9.175056295 podStartE2EDuration="25.667436433s" podCreationTimestamp="2025-11-25 13:11:12 +0000 UTC" firstStartedPulling="2025-11-25 13:11:14.367708786 +0000 UTC m=+3794.285794167" lastFinishedPulling="2025-11-25 13:11:30.860088904 +0000 UTC m=+3810.778174305" observedRunningTime="2025-11-25 13:11:37.637635126 +0000 UTC m=+3817.555720537" watchObservedRunningTime="2025-11-25 13:11:37.667436433 +0000 UTC m=+3817.585521814" Nov 25 13:11:38 crc kubenswrapper[4693]: I1125 13:11:38.631828 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/must-gather-zvnp5" event={"ID":"dc8f0717-6d45-4672-b4c6-3fb3011972eb","Type":"ContainerStarted","Data":"b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5"} Nov 25 13:11:38 crc kubenswrapper[4693]: I1125 13:11:38.632458 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/must-gather-zvnp5" event={"ID":"dc8f0717-6d45-4672-b4c6-3fb3011972eb","Type":"ContainerStarted","Data":"5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e"} Nov 25 13:11:38 crc kubenswrapper[4693]: I1125 13:11:38.657221 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8svzn/must-gather-zvnp5" podStartSLOduration=9.377098078 podStartE2EDuration="20.657201679s" podCreationTimestamp="2025-11-25 13:11:18 +0000 UTC" firstStartedPulling="2025-11-25 13:11:26.260313632 +0000 UTC m=+3806.178399013" lastFinishedPulling="2025-11-25 13:11:37.540417233 +0000 UTC m=+3817.458502614" observedRunningTime="2025-11-25 13:11:38.647912588 +0000 UTC m=+3818.565998009" watchObservedRunningTime="2025-11-25 13:11:38.657201679 +0000 UTC m=+3818.575287060" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.110514 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8svzn/crc-debug-rcn7p"] Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.112496 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.123895 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c54b010-339a-4999-ac4a-6ea1396c19ba-host\") pod \"crc-debug-rcn7p\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.124149 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4gp9\" (UniqueName: \"kubernetes.io/projected/5c54b010-339a-4999-ac4a-6ea1396c19ba-kube-api-access-p4gp9\") pod \"crc-debug-rcn7p\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.224894 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c54b010-339a-4999-ac4a-6ea1396c19ba-host\") pod \"crc-debug-rcn7p\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.224949 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4gp9\" (UniqueName: \"kubernetes.io/projected/5c54b010-339a-4999-ac4a-6ea1396c19ba-kube-api-access-p4gp9\") pod \"crc-debug-rcn7p\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.225346 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c54b010-339a-4999-ac4a-6ea1396c19ba-host\") pod \"crc-debug-rcn7p\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.244072 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4gp9\" (UniqueName: \"kubernetes.io/projected/5c54b010-339a-4999-ac4a-6ea1396c19ba-kube-api-access-p4gp9\") pod \"crc-debug-rcn7p\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.430254 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:11:42 crc kubenswrapper[4693]: W1125 13:11:42.465409 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c54b010_339a_4999_ac4a_6ea1396c19ba.slice/crio-e7c24b9650fbf01c3f84653194b9853d8272a3685bc64eeb5d15e0400de26a42 WatchSource:0}: Error finding container e7c24b9650fbf01c3f84653194b9853d8272a3685bc64eeb5d15e0400de26a42: Status 404 returned error can't find the container with id e7c24b9650fbf01c3f84653194b9853d8272a3685bc64eeb5d15e0400de26a42 Nov 25 13:11:42 crc kubenswrapper[4693]: I1125 13:11:42.671178 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" event={"ID":"5c54b010-339a-4999-ac4a-6ea1396c19ba","Type":"ContainerStarted","Data":"e7c24b9650fbf01c3f84653194b9853d8272a3685bc64eeb5d15e0400de26a42"} Nov 25 13:11:43 crc kubenswrapper[4693]: I1125 13:11:43.367267 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:43 crc kubenswrapper[4693]: I1125 13:11:43.367653 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:11:44 crc kubenswrapper[4693]: I1125 13:11:44.424665 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rt9rp" podUID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerName="registry-server" probeResult="failure" output=< Nov 25 13:11:44 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:11:44 crc kubenswrapper[4693]: > Nov 25 13:11:47 crc kubenswrapper[4693]: I1125 13:11:47.813570 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:11:47 crc kubenswrapper[4693]: E1125 13:11:47.814054 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:11:54 crc kubenswrapper[4693]: I1125 13:11:54.414642 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rt9rp" podUID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerName="registry-server" probeResult="failure" output=< Nov 25 13:11:54 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:11:54 crc kubenswrapper[4693]: > Nov 25 13:11:56 crc kubenswrapper[4693]: E1125 13:11:56.952161 4693 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Nov 25 13:11:56 crc kubenswrapper[4693]: E1125 13:11:56.952887 4693 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p 
\"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4gp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-rcn7p_openshift-must-gather-8svzn(5c54b010-339a-4999-ac4a-6ea1396c19ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 25 13:11:56 crc kubenswrapper[4693]: E1125 13:11:56.954099 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" podUID="5c54b010-339a-4999-ac4a-6ea1396c19ba" Nov 25 13:11:57 crc kubenswrapper[4693]: E1125 13:11:57.826103 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" podUID="5c54b010-339a-4999-ac4a-6ea1396c19ba" Nov 25 13:11:58 crc kubenswrapper[4693]: I1125 13:11:58.813492 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:11:58 crc kubenswrapper[4693]: E1125 13:11:58.814013 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:12:04 crc 
kubenswrapper[4693]: I1125 13:12:04.418213 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rt9rp" podUID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerName="registry-server" probeResult="failure" output=< Nov 25 13:12:04 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:12:04 crc kubenswrapper[4693]: > Nov 25 13:12:10 crc kubenswrapper[4693]: I1125 13:12:10.823952 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:12:10 crc kubenswrapper[4693]: E1125 13:12:10.824530 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:12:11 crc kubenswrapper[4693]: I1125 13:12:11.953628 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" event={"ID":"5c54b010-339a-4999-ac4a-6ea1396c19ba","Type":"ContainerStarted","Data":"3a4a881a2536f6846cfd3f8a213f12618df12241ffe28ada4e136cd44c9a3217"} Nov 25 13:12:14 crc kubenswrapper[4693]: I1125 13:12:14.421244 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rt9rp" podUID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerName="registry-server" probeResult="failure" output=< Nov 25 13:12:14 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:12:14 crc kubenswrapper[4693]: > Nov 25 13:12:24 crc kubenswrapper[4693]: I1125 13:12:24.430441 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rt9rp" podUID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerName="registry-server" probeResult="failure" output=< Nov 25 13:12:24 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:12:24 crc kubenswrapper[4693]: > Nov 25 13:12:25 crc kubenswrapper[4693]: I1125 13:12:25.812645 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:12:25 crc kubenswrapper[4693]: E1125 13:12:25.813619 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:12:34 crc kubenswrapper[4693]: I1125 13:12:34.419151 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rt9rp" podUID="a31b362f-c747-4bf0-bcce-27a2761b95e6" containerName="registry-server" probeResult="failure" output=< Nov 25 13:12:34 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:12:34 crc kubenswrapper[4693]: > Nov 25 13:12:37 crc kubenswrapper[4693]: I1125 13:12:37.813606 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:12:37 crc kubenswrapper[4693]: E1125 13:12:37.814121 4693 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:12:43 crc kubenswrapper[4693]: I1125 13:12:43.419864 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:12:43 crc kubenswrapper[4693]: I1125 13:12:43.464037 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" podStartSLOduration=32.650310726 podStartE2EDuration="1m1.464010231s" podCreationTimestamp="2025-11-25 13:11:42 +0000 UTC" firstStartedPulling="2025-11-25 13:11:42.470542994 +0000 UTC m=+3822.388628375" lastFinishedPulling="2025-11-25 13:12:11.284242499 +0000 UTC m=+3851.202327880" observedRunningTime="2025-11-25 13:12:11.970278176 +0000 UTC m=+3851.888363557" watchObservedRunningTime="2025-11-25 13:12:43.464010231 +0000 UTC m=+3883.382095622" Nov 25 13:12:43 crc kubenswrapper[4693]: I1125 13:12:43.471634 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rt9rp" Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.108556 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rt9rp"] Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.276262 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hrqcz"] Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.276611 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hrqcz" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="registry-server" containerID="cri-o://abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2" gracePeriod=2 Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.868890 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.883010 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-catalog-content\") pod \"70c79042-9553-4044-845a-a164e846c298\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.883202 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfg92\" (UniqueName: \"kubernetes.io/projected/70c79042-9553-4044-845a-a164e846c298-kube-api-access-xfg92\") pod \"70c79042-9553-4044-845a-a164e846c298\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.883238 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-utilities\") pod \"70c79042-9553-4044-845a-a164e846c298\" (UID: \"70c79042-9553-4044-845a-a164e846c298\") " Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.883659 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-utilities" (OuterVolumeSpecName: "utilities") pod "70c79042-9553-4044-845a-a164e846c298" (UID: "70c79042-9553-4044-845a-a164e846c298"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.884949 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.891279 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70c79042-9553-4044-845a-a164e846c298-kube-api-access-xfg92" (OuterVolumeSpecName: "kube-api-access-xfg92") pod "70c79042-9553-4044-845a-a164e846c298" (UID: "70c79042-9553-4044-845a-a164e846c298"). InnerVolumeSpecName "kube-api-access-xfg92". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:12:44 crc kubenswrapper[4693]: I1125 13:12:44.986195 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfg92\" (UniqueName: \"kubernetes.io/projected/70c79042-9553-4044-845a-a164e846c298-kube-api-access-xfg92\") on node \"crc\" DevicePath \"\"" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.046967 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70c79042-9553-4044-845a-a164e846c298" (UID: "70c79042-9553-4044-845a-a164e846c298"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.087986 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70c79042-9553-4044-845a-a164e846c298-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.267857 4693 generic.go:334] "Generic (PLEG): container finished" podID="70c79042-9553-4044-845a-a164e846c298" containerID="abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2" exitCode=0 Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.267928 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrqcz" event={"ID":"70c79042-9553-4044-845a-a164e846c298","Type":"ContainerDied","Data":"abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2"} Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.267994 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hrqcz" event={"ID":"70c79042-9553-4044-845a-a164e846c298","Type":"ContainerDied","Data":"1e109e98ed2e4889b62c7ac2bacdad6997b500c354d50c47bd60fed2389a2528"} Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.268017 4693 scope.go:117] "RemoveContainer" containerID="abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.267946 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hrqcz" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.306614 4693 scope.go:117] "RemoveContainer" containerID="1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.311303 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hrqcz"] Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.323756 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hrqcz"] Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.336055 4693 scope.go:117] "RemoveContainer" containerID="eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.377516 4693 scope.go:117] "RemoveContainer" containerID="abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2" Nov 25 13:12:45 crc kubenswrapper[4693]: E1125 13:12:45.378034 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2\": container with ID starting with abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2 not found: ID does not exist" containerID="abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.378084 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2"} err="failed to get container status \"abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2\": rpc error: code = NotFound desc = could not find container \"abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2\": container with ID starting with abb133c22eb4b018cbd2cef743398042b938483f6c4a58741d35caa435ccd3d2 not found: ID does not exist" Nov 25 13:12:45 crc 
kubenswrapper[4693]: I1125 13:12:45.378116 4693 scope.go:117] "RemoveContainer" containerID="1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8" Nov 25 13:12:45 crc kubenswrapper[4693]: E1125 13:12:45.378519 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8\": container with ID starting with 1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8 not found: ID does not exist" containerID="1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.378545 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8"} err="failed to get container status \"1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8\": rpc error: code = NotFound desc = could not find container \"1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8\": container with ID starting with 1e0c9dd11b4c0e5e911405d4424fb180eaea3d30df69292478dbe22ba83208c8 not found: ID does not exist" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.378558 4693 scope.go:117] "RemoveContainer" containerID="eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6" Nov 25 13:12:45 crc kubenswrapper[4693]: E1125 13:12:45.382645 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6\": container with ID starting with eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6 not found: ID does not exist" containerID="eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6" Nov 25 13:12:45 crc kubenswrapper[4693]: I1125 13:12:45.382693 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6"} err="failed to get container status \"eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6\": rpc error: code = NotFound desc = could not find container \"eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6\": container with ID starting with eb7f80df34a6e6f65f75292a1f6f08b264fe7f52322a3c1d87c013657e7db7b6 not found: ID does not exist" Nov 25 13:12:46 crc kubenswrapper[4693]: I1125 13:12:46.827125 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70c79042-9553-4044-845a-a164e846c298" path="/var/lib/kubelet/pods/70c79042-9553-4044-845a-a164e846c298/volumes" Nov 25 13:12:52 crc kubenswrapper[4693]: I1125 13:12:52.820089 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:12:53 crc kubenswrapper[4693]: E1125 13:12:52.820983 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:12:55 crc kubenswrapper[4693]: I1125 13:12:55.376533 4693 generic.go:334] "Generic (PLEG): container finished" podID="5c54b010-339a-4999-ac4a-6ea1396c19ba" 
containerID="3a4a881a2536f6846cfd3f8a213f12618df12241ffe28ada4e136cd44c9a3217" exitCode=0 Nov 25 13:12:55 crc kubenswrapper[4693]: I1125 13:12:55.376613 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" event={"ID":"5c54b010-339a-4999-ac4a-6ea1396c19ba","Type":"ContainerDied","Data":"3a4a881a2536f6846cfd3f8a213f12618df12241ffe28ada4e136cd44c9a3217"} Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.483483 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.514245 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4gp9\" (UniqueName: \"kubernetes.io/projected/5c54b010-339a-4999-ac4a-6ea1396c19ba-kube-api-access-p4gp9\") pod \"5c54b010-339a-4999-ac4a-6ea1396c19ba\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.514412 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c54b010-339a-4999-ac4a-6ea1396c19ba-host\") pod \"5c54b010-339a-4999-ac4a-6ea1396c19ba\" (UID: \"5c54b010-339a-4999-ac4a-6ea1396c19ba\") " Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.514526 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c54b010-339a-4999-ac4a-6ea1396c19ba-host" (OuterVolumeSpecName: "host") pod "5c54b010-339a-4999-ac4a-6ea1396c19ba" (UID: "5c54b010-339a-4999-ac4a-6ea1396c19ba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.514720 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c54b010-339a-4999-ac4a-6ea1396c19ba-host\") on node \"crc\" DevicePath \"\"" Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.521559 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c54b010-339a-4999-ac4a-6ea1396c19ba-kube-api-access-p4gp9" (OuterVolumeSpecName: "kube-api-access-p4gp9") pod "5c54b010-339a-4999-ac4a-6ea1396c19ba" (UID: "5c54b010-339a-4999-ac4a-6ea1396c19ba"). InnerVolumeSpecName "kube-api-access-p4gp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.523332 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8svzn/crc-debug-rcn7p"] Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.533199 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8svzn/crc-debug-rcn7p"] Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.617177 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4gp9\" (UniqueName: \"kubernetes.io/projected/5c54b010-339a-4999-ac4a-6ea1396c19ba-kube-api-access-p4gp9\") on node \"crc\" DevicePath \"\"" Nov 25 13:12:56 crc kubenswrapper[4693]: I1125 13:12:56.826026 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c54b010-339a-4999-ac4a-6ea1396c19ba" path="/var/lib/kubelet/pods/5c54b010-339a-4999-ac4a-6ea1396c19ba/volumes" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.398548 4693 scope.go:117] "RemoveContainer" containerID="3a4a881a2536f6846cfd3f8a213f12618df12241ffe28ada4e136cd44c9a3217" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.398564 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-rcn7p" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.695916 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8svzn/crc-debug-h96vq"] Nov 25 13:12:57 crc kubenswrapper[4693]: E1125 13:12:57.696575 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="extract-utilities" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.696605 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="extract-utilities" Nov 25 13:12:57 crc kubenswrapper[4693]: E1125 13:12:57.696690 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="registry-server" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.696712 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="registry-server" Nov 25 13:12:57 crc kubenswrapper[4693]: E1125 13:12:57.696745 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c54b010-339a-4999-ac4a-6ea1396c19ba" containerName="container-00" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.696763 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c54b010-339a-4999-ac4a-6ea1396c19ba" containerName="container-00" Nov 25 13:12:57 crc kubenswrapper[4693]: E1125 13:12:57.696785 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="extract-content" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.696797 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="extract-content" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.697148 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="70c79042-9553-4044-845a-a164e846c298" containerName="registry-server" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.697187 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c54b010-339a-4999-ac4a-6ea1396c19ba" containerName="container-00" Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.698308 4693 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.845921 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cf8038-cb70-4713-90ba-7c1b75da668b-host\") pod \"crc-debug-h96vq\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") " pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.846207 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzz6n\" (UniqueName: \"kubernetes.io/projected/e2cf8038-cb70-4713-90ba-7c1b75da668b-kube-api-access-rzz6n\") pod \"crc-debug-h96vq\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") " pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.948553 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzz6n\" (UniqueName: \"kubernetes.io/projected/e2cf8038-cb70-4713-90ba-7c1b75da668b-kube-api-access-rzz6n\") pod \"crc-debug-h96vq\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") " pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.948704 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cf8038-cb70-4713-90ba-7c1b75da668b-host\") pod \"crc-debug-h96vq\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") " pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.949567 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cf8038-cb70-4713-90ba-7c1b75da668b-host\") pod \"crc-debug-h96vq\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") " pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:57 crc kubenswrapper[4693]: I1125 13:12:57.974940 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzz6n\" (UniqueName: \"kubernetes.io/projected/e2cf8038-cb70-4713-90ba-7c1b75da668b-kube-api-access-rzz6n\") pod \"crc-debug-h96vq\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") " pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:58 crc kubenswrapper[4693]: I1125 13:12:58.024533 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:12:58 crc kubenswrapper[4693]: I1125 13:12:58.411897 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/crc-debug-h96vq" event={"ID":"e2cf8038-cb70-4713-90ba-7c1b75da668b","Type":"ContainerStarted","Data":"432164f545ed3d53e13aa32ec69386030ffb9d47e3872aa8e66d97f472554492"}
Nov 25 13:12:59 crc kubenswrapper[4693]: E1125 13:12:59.112041 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2cf8038_cb70_4713_90ba_7c1b75da668b.slice/crio-conmon-9367e0d2e900880560430e1757732c40eddd777ae543cf876df74e5059bf2fcd.scope\": RecentStats: unable to find data in memory cache]"
Nov 25 13:12:59 crc kubenswrapper[4693]: I1125 13:12:59.423790 4693 generic.go:334] "Generic (PLEG): container finished" podID="e2cf8038-cb70-4713-90ba-7c1b75da668b" containerID="9367e0d2e900880560430e1757732c40eddd777ae543cf876df74e5059bf2fcd" exitCode=0
Nov 25 13:12:59 crc kubenswrapper[4693]: I1125 13:12:59.423833 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/crc-debug-h96vq" event={"ID":"e2cf8038-cb70-4713-90ba-7c1b75da668b","Type":"ContainerDied","Data":"9367e0d2e900880560430e1757732c40eddd777ae543cf876df74e5059bf2fcd"}
Nov 25 13:12:59 crc kubenswrapper[4693]: I1125 13:12:59.950108 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8svzn/crc-debug-h96vq"]
Nov 25 13:12:59 crc kubenswrapper[4693]: I1125 13:12:59.959285 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8svzn/crc-debug-h96vq"]
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.544093 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h96vq"
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.699957 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzz6n\" (UniqueName: \"kubernetes.io/projected/e2cf8038-cb70-4713-90ba-7c1b75da668b-kube-api-access-rzz6n\") pod \"e2cf8038-cb70-4713-90ba-7c1b75da668b\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") "
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.700154 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cf8038-cb70-4713-90ba-7c1b75da668b-host\") pod \"e2cf8038-cb70-4713-90ba-7c1b75da668b\" (UID: \"e2cf8038-cb70-4713-90ba-7c1b75da668b\") "
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.700347 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2cf8038-cb70-4713-90ba-7c1b75da668b-host" (OuterVolumeSpecName: "host") pod "e2cf8038-cb70-4713-90ba-7c1b75da668b" (UID: "e2cf8038-cb70-4713-90ba-7c1b75da668b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.701158 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2cf8038-cb70-4713-90ba-7c1b75da668b-host\") on node \"crc\" DevicePath \"\""
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.709593 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cf8038-cb70-4713-90ba-7c1b75da668b-kube-api-access-rzz6n" (OuterVolumeSpecName: "kube-api-access-rzz6n") pod "e2cf8038-cb70-4713-90ba-7c1b75da668b" (UID: "e2cf8038-cb70-4713-90ba-7c1b75da668b"). InnerVolumeSpecName "kube-api-access-rzz6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.802410 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzz6n\" (UniqueName: \"kubernetes.io/projected/e2cf8038-cb70-4713-90ba-7c1b75da668b-kube-api-access-rzz6n\") on node \"crc\" DevicePath \"\""
Nov 25 13:13:00 crc kubenswrapper[4693]: I1125 13:13:00.826288 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cf8038-cb70-4713-90ba-7c1b75da668b" path="/var/lib/kubelet/pods/e2cf8038-cb70-4713-90ba-7c1b75da668b/volumes"
Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.100158 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8svzn/crc-debug-h8488"]
Nov 25 13:13:01 crc kubenswrapper[4693]: E1125 13:13:01.100776 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cf8038-cb70-4713-90ba-7c1b75da668b" containerName="container-00"
Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.100858 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cf8038-cb70-4713-90ba-7c1b75da668b" containerName="container-00"
Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.101151 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cf8038-cb70-4713-90ba-7c1b75da668b" containerName="container-00"
Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.101840 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h8488"
Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.107390 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4lsb\" (UniqueName: \"kubernetes.io/projected/1643e49c-0b38-4a76-a4d8-8768fea8a113-kube-api-access-p4lsb\") pod \"crc-debug-h8488\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.107489 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1643e49c-0b38-4a76-a4d8-8768fea8a113-host\") pod \"crc-debug-h8488\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.209908 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4lsb\" (UniqueName: \"kubernetes.io/projected/1643e49c-0b38-4a76-a4d8-8768fea8a113-kube-api-access-p4lsb\") pod \"crc-debug-h8488\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.210045 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1643e49c-0b38-4a76-a4d8-8768fea8a113-host\") pod \"crc-debug-h8488\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.210224 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1643e49c-0b38-4a76-a4d8-8768fea8a113-host\") pod \"crc-debug-h8488\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.229320 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4lsb\" (UniqueName: \"kubernetes.io/projected/1643e49c-0b38-4a76-a4d8-8768fea8a113-kube-api-access-p4lsb\") pod \"crc-debug-h8488\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.418281 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.444874 4693 scope.go:117] "RemoveContainer" containerID="9367e0d2e900880560430e1757732c40eddd777ae543cf876df74e5059bf2fcd" Nov 25 13:13:01 crc kubenswrapper[4693]: I1125 13:13:01.444904 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h96vq" Nov 25 13:13:01 crc kubenswrapper[4693]: W1125 13:13:01.455130 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1643e49c_0b38_4a76_a4d8_8768fea8a113.slice/crio-016e0d832ca7fca10e9476b931dcc7df3cdef0146266a0fa759b5da344a20f85 WatchSource:0}: Error finding container 016e0d832ca7fca10e9476b931dcc7df3cdef0146266a0fa759b5da344a20f85: Status 404 returned error can't find the container with id 016e0d832ca7fca10e9476b931dcc7df3cdef0146266a0fa759b5da344a20f85 Nov 25 13:13:02 crc kubenswrapper[4693]: I1125 13:13:02.455410 4693 generic.go:334] "Generic (PLEG): container finished" podID="1643e49c-0b38-4a76-a4d8-8768fea8a113" containerID="d6410b813380160c0f6967e8b5cea309421f2028d2c3f4ed5a0c9e39ef6c1674" exitCode=0 Nov 25 13:13:02 crc kubenswrapper[4693]: I1125 13:13:02.455476 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/crc-debug-h8488" event={"ID":"1643e49c-0b38-4a76-a4d8-8768fea8a113","Type":"ContainerDied","Data":"d6410b813380160c0f6967e8b5cea309421f2028d2c3f4ed5a0c9e39ef6c1674"} Nov 25 13:13:02 crc kubenswrapper[4693]: I1125 13:13:02.456553 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/crc-debug-h8488" event={"ID":"1643e49c-0b38-4a76-a4d8-8768fea8a113","Type":"ContainerStarted","Data":"016e0d832ca7fca10e9476b931dcc7df3cdef0146266a0fa759b5da344a20f85"} Nov 25 13:13:02 crc kubenswrapper[4693]: I1125 13:13:02.495762 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8svzn/crc-debug-h8488"] Nov 25 13:13:02 crc kubenswrapper[4693]: I1125 13:13:02.504772 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8svzn/crc-debug-h8488"] Nov 25 13:13:03 crc kubenswrapper[4693]: I1125 13:13:03.581147 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:03 crc kubenswrapper[4693]: I1125 13:13:03.659593 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1643e49c-0b38-4a76-a4d8-8768fea8a113-host\") pod \"1643e49c-0b38-4a76-a4d8-8768fea8a113\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " Nov 25 13:13:03 crc kubenswrapper[4693]: I1125 13:13:03.659724 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4lsb\" (UniqueName: \"kubernetes.io/projected/1643e49c-0b38-4a76-a4d8-8768fea8a113-kube-api-access-p4lsb\") pod \"1643e49c-0b38-4a76-a4d8-8768fea8a113\" (UID: \"1643e49c-0b38-4a76-a4d8-8768fea8a113\") " Nov 25 13:13:03 crc kubenswrapper[4693]: I1125 13:13:03.659731 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1643e49c-0b38-4a76-a4d8-8768fea8a113-host" (OuterVolumeSpecName: "host") pod "1643e49c-0b38-4a76-a4d8-8768fea8a113" (UID: "1643e49c-0b38-4a76-a4d8-8768fea8a113"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 13:13:03 crc kubenswrapper[4693]: I1125 13:13:03.660134 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1643e49c-0b38-4a76-a4d8-8768fea8a113-host\") on node \"crc\" DevicePath \"\"" Nov 25 13:13:03 crc kubenswrapper[4693]: I1125 13:13:03.665566 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1643e49c-0b38-4a76-a4d8-8768fea8a113-kube-api-access-p4lsb" (OuterVolumeSpecName: "kube-api-access-p4lsb") pod "1643e49c-0b38-4a76-a4d8-8768fea8a113" (UID: "1643e49c-0b38-4a76-a4d8-8768fea8a113"). InnerVolumeSpecName "kube-api-access-p4lsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:13:03 crc kubenswrapper[4693]: I1125 13:13:03.762431 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4lsb\" (UniqueName: \"kubernetes.io/projected/1643e49c-0b38-4a76-a4d8-8768fea8a113-kube-api-access-p4lsb\") on node \"crc\" DevicePath \"\"" Nov 25 13:13:04 crc kubenswrapper[4693]: I1125 13:13:04.478670 4693 scope.go:117] "RemoveContainer" containerID="d6410b813380160c0f6967e8b5cea309421f2028d2c3f4ed5a0c9e39ef6c1674" Nov 25 13:13:04 crc kubenswrapper[4693]: I1125 13:13:04.478696 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/crc-debug-h8488" Nov 25 13:13:04 crc kubenswrapper[4693]: I1125 13:13:04.826043 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1643e49c-0b38-4a76-a4d8-8768fea8a113" path="/var/lib/kubelet/pods/1643e49c-0b38-4a76-a4d8-8768fea8a113/volumes" Nov 25 13:13:06 crc kubenswrapper[4693]: I1125 13:13:06.813978 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:13:06 crc kubenswrapper[4693]: E1125 13:13:06.814446 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:13:19 crc kubenswrapper[4693]: I1125 13:13:19.812892 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:13:19 crc kubenswrapper[4693]: E1125 13:13:19.813665 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:13:19 crc kubenswrapper[4693]: I1125 13:13:19.902563 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c49f9f854-clbv4_0c770547-1b83-43a7-ac47-82226ce02958/barbican-api/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.146716 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c49f9f854-clbv4_0c770547-1b83-43a7-ac47-82226ce02958/barbican-api-log/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.239006 4693 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d949d856b-fbdcg_c743d467-4bdb-41ce-bf74-5051a93fc3d6/barbican-keystone-listener-log/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.265324 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d949d856b-fbdcg_c743d467-4bdb-41ce-bf74-5051a93fc3d6/barbican-keystone-listener/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.485811 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d9fc5569-wdqkp_084144ce-d043-4dd8-bc4b-e904c42e47cd/barbican-worker-log/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.502409 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d9fc5569-wdqkp_084144ce-d043-4dd8-bc4b-e904c42e47cd/barbican-worker/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.692447 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-46z97_0c125840-c37c-445e-95d9-37c74703ea85/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.785888 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/ceilometer-central-agent/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.821311 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/ceilometer-notification-agent/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.896630 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/proxy-httpd/0.log" Nov 25 13:13:20 crc kubenswrapper[4693]: I1125 13:13:20.942801 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/sg-core/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.143450 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8aeba5db-6f5b-4714-9d46-5db9b9058cb6/cinder-api/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.186536 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8aeba5db-6f5b-4714-9d46-5db9b9058cb6/cinder-api-log/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.287813 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_42d5e91d-841b-453a-a5db-f2d1bf40fbec/cinder-scheduler/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.417679 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_42d5e91d-841b-453a-a5db-f2d1bf40fbec/probe/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.535551 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qfskr_216fd77e-1bfd-4c99-8fd8-2711d9de6beb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.638060 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd_9667d434-5214-4754-baea-bcc266b58358/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.766733 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-5d5cf5b645-f87gr_fcc8f52b-776d-4a49-b62f-bf73fcc35fe0/init/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.929313 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d5cf5b645-f87gr_fcc8f52b-776d-4a49-b62f-bf73fcc35fe0/init/0.log" Nov 25 13:13:21 crc kubenswrapper[4693]: I1125 13:13:21.968735 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2_81f5f268-3ead-442b-ae8f-d7e2c11a6752/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:22 crc kubenswrapper[4693]: I1125 13:13:22.032545 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d5cf5b645-f87gr_fcc8f52b-776d-4a49-b62f-bf73fcc35fe0/dnsmasq-dns/0.log" Nov 25 13:13:22 crc kubenswrapper[4693]: I1125 13:13:22.183411 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bb2f0f2d-5d66-485f-a389-e07c52f143f2/glance-httpd/0.log" Nov 25 13:13:22 crc kubenswrapper[4693]: I1125 13:13:22.257767 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bb2f0f2d-5d66-485f-a389-e07c52f143f2/glance-log/0.log" Nov 25 13:13:22 crc kubenswrapper[4693]: I1125 13:13:22.564189 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_711404f8-4ff3-44b1-b4f5-dfdc70ac930f/glance-log/0.log" Nov 25 13:13:22 crc kubenswrapper[4693]: I1125 13:13:22.570237 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_711404f8-4ff3-44b1-b4f5-dfdc70ac930f/glance-httpd/0.log" Nov 25 13:13:22 crc kubenswrapper[4693]: I1125 13:13:22.759537 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-574fd6fdfd-bz6sm_14ff5a36-1912-43a8-b87f-57a6858a5799/horizon/0.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.017248 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr_6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.164136 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kvl2g_c0c79f30-0e24-4101-8632-19de1642f7e2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.189678 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-574fd6fdfd-bz6sm_14ff5a36-1912-43a8-b87f-57a6858a5799/horizon-log/0.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.368442 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7888468d67-2bztz_e3c87b9d-25f9-445f-be14-b43f1cb887a4/keystone-api/0.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.439639 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29401261-7sn2t_3306a30d-dcff-4460-81f2-3561573e57a2/keystone-cron/0.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.640497 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ee5b4281-3cdb-4bad-8002-8520136232a4/kube-state-metrics/3.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.699747 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_ee5b4281-3cdb-4bad-8002-8520136232a4/kube-state-metrics/2.log" Nov 25 13:13:23 crc kubenswrapper[4693]: I1125 13:13:23.857515 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-njx62_df6120d2-3571-4059-8fb1-d40741960cff/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:24 crc kubenswrapper[4693]: I1125 13:13:24.169861 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c497f557-r9sp7_66ccc10f-a153-4582-ab8d-f687b0c6bb20/neutron-httpd/0.log" Nov 25 13:13:24 crc kubenswrapper[4693]: I1125 13:13:24.197032 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c497f557-r9sp7_66ccc10f-a153-4582-ab8d-f687b0c6bb20/neutron-api/0.log" Nov 25 13:13:24 crc kubenswrapper[4693]: I1125 13:13:24.290219 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz_3e1334ad-6a95-4d72-95a2-5dfa8d78e530/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:24 crc kubenswrapper[4693]: I1125 13:13:24.795676 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_37605f96-b59e-45ad-b177-dad562d6af05/nova-api-log/0.log" Nov 25 13:13:24 crc kubenswrapper[4693]: I1125 13:13:24.932047 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bdc79fdf-d996-42ba-b250-2501738ed0bc/nova-cell0-conductor-conductor/0.log" Nov 25 13:13:25 crc kubenswrapper[4693]: I1125 13:13:25.055553 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_37605f96-b59e-45ad-b177-dad562d6af05/nova-api-api/0.log" Nov 25 13:13:25 crc kubenswrapper[4693]: I1125 13:13:25.227832 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_35d79906-6ec5-4483-83ef-ae2ff2674c86/nova-cell1-conductor-conductor/0.log" Nov 25 13:13:25 crc kubenswrapper[4693]: I1125 13:13:25.393070 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f2e39ed4-ac1c-4961-80a1-24b93bed8f4b/nova-cell1-novncproxy-novncproxy/0.log" Nov 25 13:13:25 crc kubenswrapper[4693]: I1125 13:13:25.500423 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-q4ddm_2a152944-4c08-47ee-bc41-90fa01d90bb1/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:25 crc kubenswrapper[4693]: I1125 13:13:25.535953 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_599f11ff-5079-4815-be45-5ffac410eb82/nova-metadata-log/0.log" Nov 25 13:13:25 crc kubenswrapper[4693]: I1125 13:13:25.901095 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4eca8fd3-dd93-493a-9278-1749de83eae1/nova-scheduler-scheduler/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.021984 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9fc3b8be-d4cc-4bb4-86f0-5516294c1221/mysql-bootstrap/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.149851 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9fc3b8be-d4cc-4bb4-86f0-5516294c1221/mysql-bootstrap/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.206552 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_9fc3b8be-d4cc-4bb4-86f0-5516294c1221/galera/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.394359 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4cf2be5d-1c6c-402f-bf93-e9653a6a84cd/mysql-bootstrap/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.605813 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4cf2be5d-1c6c-402f-bf93-e9653a6a84cd/mysql-bootstrap/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.615357 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4cf2be5d-1c6c-402f-bf93-e9653a6a84cd/galera/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.821977 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d51f97b0-16ac-43b8-aa77-b2a66faef2cd/openstackclient/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.899077 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_599f11ff-5079-4815-be45-5ffac410eb82/nova-metadata-metadata/0.log" Nov 25 13:13:26 crc kubenswrapper[4693]: I1125 13:13:26.928879 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-szpg5_266234f1-8683-4a0d-a1ec-42cd82184f11/openstack-network-exporter/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.153158 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ndgsx_96236f54-53d2-47df-854b-51addeda1dee/ovn-controller/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.198278 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovsdb-server-init/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.359560 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovs-vswitchd/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.430672 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovsdb-server-init/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.479275 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovsdb-server/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.767009 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7ct5k_d7282171-b6bf-44b4-a5a3-f60d6d5baa5f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.875864 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5de2ed94-055d-4e4b-b069-3bcafd88cc3f/openstack-network-exporter/0.log" Nov 25 13:13:27 crc kubenswrapper[4693]: I1125 13:13:27.879247 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5de2ed94-055d-4e4b-b069-3bcafd88cc3f/ovn-northd/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.065996 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4d617274-42b9-4d07-b321-d70a5aeba8ee/openstack-network-exporter/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.159324 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_4d617274-42b9-4d07-b321-d70a5aeba8ee/ovsdbserver-nb/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.257779 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_89a481b1-6040-4f15-a63f-d6d2301c3534/openstack-network-exporter/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.285763 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_89a481b1-6040-4f15-a63f-d6d2301c3534/ovsdbserver-sb/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.484727 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bf98548b6-68m92_d7f68eff-0e73-43ec-bb9a-97fd321b92ec/placement-api/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.590559 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bf98548b6-68m92_d7f68eff-0e73-43ec-bb9a-97fd321b92ec/placement-log/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.691174 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5c73e56b-c0f3-4d6d-9e33-26fe0d552e24/setup-container/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.810803 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5c73e56b-c0f3-4d6d-9e33-26fe0d552e24/setup-container/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.947678 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5c73e56b-c0f3-4d6d-9e33-26fe0d552e24/rabbitmq/0.log" Nov 25 13:13:28 crc kubenswrapper[4693]: I1125 13:13:28.963091 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_98b0bc68-9551-407d-8390-66688e8255d3/setup-container/0.log" Nov 25 13:13:29 crc kubenswrapper[4693]: I1125 13:13:29.205810 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7_622af4c3-4b56-4b3c-8ea2-6d30432a706a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:29 crc kubenswrapper[4693]: I1125 13:13:29.271044 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_98b0bc68-9551-407d-8390-66688e8255d3/setup-container/0.log" Nov 25 13:13:29 crc kubenswrapper[4693]: I1125 13:13:29.277687 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_98b0bc68-9551-407d-8390-66688e8255d3/rabbitmq/0.log" Nov 25 13:13:29 crc kubenswrapper[4693]: I1125 13:13:29.454308 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-65zzc_70ae1b8a-3af0-4d98-a633-6933a83b2b71/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:29 crc kubenswrapper[4693]: I1125 13:13:29.529761 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k_3f8577c4-f507-4e40-b284-66d57b0aee3d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:29 crc kubenswrapper[4693]: I1125 13:13:29.661285 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vwwwd_a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:29 crc kubenswrapper[4693]: I1125 13:13:29.769570 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cx4gc_3200a40a-dfa0-40f7-a79d-054de8e9e386/ssh-known-hosts-edpm-deployment/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.005226 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5df97c965f-mdrk8_3bce19a4-5298-4024-b291-19e2d6138081/proxy-server/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.092959 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5df97c965f-mdrk8_3bce19a4-5298-4024-b291-19e2d6138081/proxy-httpd/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.343607 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2kzct_88ff5ba0-ea04-4e77-9f16-05711082df93/swift-ring-rebalance/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.350689 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-auditor/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.367549 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-reaper/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.555267 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-server/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.563040 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-auditor/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.623432 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-replicator/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.653354 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-replicator/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.749170 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-server/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.845885 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-updater/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.866437 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-auditor/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.874707 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-expirer/0.log" Nov 25 13:13:30 crc kubenswrapper[4693]: I1125 13:13:30.999287 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-replicator/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.051284 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-server/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.064324 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-updater/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.158228 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/rsync/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.212715 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/swift-recon-cron/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.358815 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-h89kr_ebbe9089-3f4f-46c6-a5ea-ff523e970069/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.472434 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9b66ebc6-a0f0-4418-8d28-7364b1f5d177/tempest-tests-tempest-tests-runner/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.640991 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_54397ebd-dc91-441f-9c68-261a1c952589/test-operator-logs-container/0.log" Nov 25 13:13:31 crc kubenswrapper[4693]: I1125 13:13:31.701842 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv_6afe8ee4-7d98-4751-a224-b99437561d70/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:13:33 crc kubenswrapper[4693]: I1125 13:13:33.812769 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:13:33 crc kubenswrapper[4693]: E1125 13:13:33.813454 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:13:43 crc kubenswrapper[4693]: I1125 13:13:43.550898 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_459f5353-15bd-4139-a363-7a1bf6fe94cf/memcached/0.log" Nov 25 13:13:47 crc kubenswrapper[4693]: I1125 13:13:47.812474 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:13:47 crc kubenswrapper[4693]: E1125 13:13:47.813197 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.675451 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-82dm8"] Nov 25 13:13:53 crc kubenswrapper[4693]: E1125 13:13:53.676875 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1643e49c-0b38-4a76-a4d8-8768fea8a113" containerName="container-00" Nov 25 13:13:53 crc kubenswrapper[4693]: 
I1125 13:13:53.676893 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="1643e49c-0b38-4a76-a4d8-8768fea8a113" containerName="container-00" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.677183 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="1643e49c-0b38-4a76-a4d8-8768fea8a113" containerName="container-00" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.679171 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.712090 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82dm8"] Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.797528 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-catalog-content\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.797593 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4xhc\" (UniqueName: \"kubernetes.io/projected/87cbcdc8-d856-4b47-abb9-bbde836b0adf-kube-api-access-k4xhc\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.797622 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-utilities\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.899570 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-catalog-content\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.899642 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4xhc\" (UniqueName: \"kubernetes.io/projected/87cbcdc8-d856-4b47-abb9-bbde836b0adf-kube-api-access-k4xhc\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.899667 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-utilities\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.900137 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-catalog-content\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 
13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.900187 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-utilities\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:53 crc kubenswrapper[4693]: I1125 13:13:53.930087 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4xhc\" (UniqueName: \"kubernetes.io/projected/87cbcdc8-d856-4b47-abb9-bbde836b0adf-kube-api-access-k4xhc\") pod \"community-operators-82dm8\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:54 crc kubenswrapper[4693]: I1125 13:13:54.000988 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:13:54 crc kubenswrapper[4693]: I1125 13:13:54.631473 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82dm8"] Nov 25 13:13:54 crc kubenswrapper[4693]: I1125 13:13:54.902219 4693 generic.go:334] "Generic (PLEG): container finished" podID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerID="143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17" exitCode=0 Nov 25 13:13:54 crc kubenswrapper[4693]: I1125 13:13:54.902262 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82dm8" event={"ID":"87cbcdc8-d856-4b47-abb9-bbde836b0adf","Type":"ContainerDied","Data":"143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17"} Nov 25 13:13:54 crc kubenswrapper[4693]: I1125 13:13:54.902290 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82dm8" event={"ID":"87cbcdc8-d856-4b47-abb9-bbde836b0adf","Type":"ContainerStarted","Data":"1d2ff57820cc01400d6f66d1b0018caf7bf043f4efec6d99c93cd53f177ff914"} Nov 25 13:13:55 crc kubenswrapper[4693]: I1125 13:13:55.913010 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82dm8" event={"ID":"87cbcdc8-d856-4b47-abb9-bbde836b0adf","Type":"ContainerStarted","Data":"e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7"} Nov 25 13:13:56 crc kubenswrapper[4693]: I1125 13:13:56.924813 4693 generic.go:334] "Generic (PLEG): container finished" podID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerID="e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7" exitCode=0 Nov 25 13:13:56 crc kubenswrapper[4693]: I1125 13:13:56.924854 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82dm8" event={"ID":"87cbcdc8-d856-4b47-abb9-bbde836b0adf","Type":"ContainerDied","Data":"e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7"} Nov 25 13:13:57 crc kubenswrapper[4693]: I1125 13:13:57.942524 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82dm8" event={"ID":"87cbcdc8-d856-4b47-abb9-bbde836b0adf","Type":"ContainerStarted","Data":"ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94"} Nov 25 13:13:57 crc kubenswrapper[4693]: I1125 13:13:57.976382 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-82dm8" podStartSLOduration=2.321487828 podStartE2EDuration="4.976350315s" podCreationTimestamp="2025-11-25 
13:13:53 +0000 UTC" firstStartedPulling="2025-11-25 13:13:54.906231107 +0000 UTC m=+3954.824316488" lastFinishedPulling="2025-11-25 13:13:57.561093594 +0000 UTC m=+3957.479178975" observedRunningTime="2025-11-25 13:13:57.970763614 +0000 UTC m=+3957.888848995" watchObservedRunningTime="2025-11-25 13:13:57.976350315 +0000 UTC m=+3957.894435696" Nov 25 13:13:58 crc kubenswrapper[4693]: I1125 13:13:58.711136 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-6wxtj_2f11c884-15fc-4e2a-a533-d0eac0639f80/kube-rbac-proxy/0.log" Nov 25 13:13:58 crc kubenswrapper[4693]: I1125 13:13:58.795152 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-6wxtj_2f11c884-15fc-4e2a-a533-d0eac0639f80/manager/2.log" Nov 25 13:13:58 crc kubenswrapper[4693]: I1125 13:13:58.931771 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-86dc4d89c8-6wxtj_2f11c884-15fc-4e2a-a533-d0eac0639f80/manager/1.log" Nov 25 13:13:58 crc kubenswrapper[4693]: I1125 13:13:58.963873 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/util/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.164852 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/pull/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.192007 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/util/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.220941 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/pull/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.413220 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/util/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.435447 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/pull/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.470988 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/extract/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.668671 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-4lt8v_4ab70f55-282f-4509-bc36-71ef2fe4d35b/kube-rbac-proxy/0.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.688414 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-4lt8v_4ab70f55-282f-4509-bc36-71ef2fe4d35b/manager/2.log" Nov 25 13:13:59 crc kubenswrapper[4693]: I1125 13:13:59.715667 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-4lt8v_4ab70f55-282f-4509-bc36-71ef2fe4d35b/manager/1.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.007054 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-6dtx6_9cc5c4a9-0119-48b6-a795-9f482b55278b/kube-rbac-proxy/0.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.014723 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-6dtx6_9cc5c4a9-0119-48b6-a795-9f482b55278b/manager/2.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.212600 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-866fd_7cb65a4e-3294-4104-b3bf-6d1103b92c38/kube-rbac-proxy/0.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.498838 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-866fd_7cb65a4e-3294-4104-b3bf-6d1103b92c38/manager/1.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.500186 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-6dtx6_9cc5c4a9-0119-48b6-a795-9f482b55278b/manager/1.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.503158 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-866fd_7cb65a4e-3294-4104-b3bf-6d1103b92c38/manager/2.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.553763 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-nzz29_b29c9c21-026a-4701-99a7-769d382a2da2/kube-rbac-proxy/0.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.744946 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-nzz29_b29c9c21-026a-4701-99a7-769d382a2da2/manager/2.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.748854 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-nzz29_b29c9c21-026a-4701-99a7-769d382a2da2/manager/1.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.760897 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-fwwsj_4dd9cd53-1f66-4636-9fab-9f0b3ff38009/kube-rbac-proxy/0.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.972151 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-fwwsj_4dd9cd53-1f66-4636-9fab-9f0b3ff38009/manager/2.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.986030 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-fwwsj_4dd9cd53-1f66-4636-9fab-9f0b3ff38009/manager/1.log" Nov 25 13:14:00 crc kubenswrapper[4693]: I1125 13:14:00.997665 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-r86ct_5c98082e-070e-42b1-afdc-69cea132629e/kube-rbac-proxy/0.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.158472 4693 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-r86ct_5c98082e-070e-42b1-afdc-69cea132629e/manager/1.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.159331 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-r86ct_5c98082e-070e-42b1-afdc-69cea132629e/manager/2.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.221708 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-szrv4_3c29e8b9-57cf-4967-b5e2-a6af42c16099/kube-rbac-proxy/0.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.371014 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-szrv4_3c29e8b9-57cf-4967-b5e2-a6af42c16099/manager/1.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.400456 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-szrv4_3c29e8b9-57cf-4967-b5e2-a6af42c16099/manager/2.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.776612 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-zcpsz_a64b0f5c-e6af-4903-925a-028aec5477fd/kube-rbac-proxy/0.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.812905 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:14:01 crc kubenswrapper[4693]: E1125 13:14:01.813217 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.864189 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-zcpsz_a64b0f5c-e6af-4903-925a-028aec5477fd/manager/1.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.868621 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-zcpsz_a64b0f5c-e6af-4903-925a-028aec5477fd/manager/2.log" Nov 25 13:14:01 crc kubenswrapper[4693]: I1125 13:14:01.945216 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-5ghnq_bfeee7c1-207f-4862-b172-f2ffab4a1500/kube-rbac-proxy/0.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.043512 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-5ghnq_bfeee7c1-207f-4862-b172-f2ffab4a1500/manager/2.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.113009 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-5ghnq_bfeee7c1-207f-4862-b172-f2ffab4a1500/manager/1.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.149493 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-s9shw_22a83ecc-1f72-4474-a470-2ee4bef7eddf/kube-rbac-proxy/0.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.233329 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-s9shw_22a83ecc-1f72-4474-a470-2ee4bef7eddf/manager/2.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.354932 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-csrpt_0f35f544-581e-4cb2-900f-71213e27477d/kube-rbac-proxy/0.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.482243 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-s9shw_22a83ecc-1f72-4474-a470-2ee4bef7eddf/manager/1.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.495917 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-csrpt_0f35f544-581e-4cb2-900f-71213e27477d/manager/1.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.497898 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-csrpt_0f35f544-581e-4cb2-900f-71213e27477d/manager/2.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.532003 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-flxdz_7ecc8c23-d9b2-4d46-a8b0-76758035b267/kube-rbac-proxy/0.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.818532 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-flxdz_7ecc8c23-d9b2-4d46-a8b0-76758035b267/manager/2.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.841422 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-flxdz_7ecc8c23-d9b2-4d46-a8b0-76758035b267/manager/1.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.856637 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-g972v_fe2a0074-66dc-4730-9321-772ee8fd8e28/manager/2.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.874546 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-g972v_fe2a0074-66dc-4730-9321-772ee8fd8e28/kube-rbac-proxy/0.log" Nov 25 13:14:02 crc kubenswrapper[4693]: I1125 13:14:02.984004 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-g972v_fe2a0074-66dc-4730-9321-772ee8fd8e28/manager/1.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.042013 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-jlbhg_a7c4eb9b-38af-41da-872e-b3da515b2f88/kube-rbac-proxy/0.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.124199 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-jlbhg_a7c4eb9b-38af-41da-872e-b3da515b2f88/manager/1.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.124973 4693 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-jlbhg_a7c4eb9b-38af-41da-872e-b3da515b2f88/manager/0.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.219763 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-rqjq9_c80a0f65-6193-435f-8138-eb5a4ba71b22/manager/1.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.562156 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-rqjq9_c80a0f65-6193-435f-8138-eb5a4ba71b22/manager/2.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.908973 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-gk28d_ebf85cb6-2651-4b5f-9cbe-973db55e14c5/operator/1.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.976977 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7fm7p_b14001de-fa88-4632-87e8-e5a4d703e633/registry-server/0.log" Nov 25 13:14:03 crc kubenswrapper[4693]: I1125 13:14:03.983814 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-gk28d_ebf85cb6-2651-4b5f-9cbe-973db55e14c5/operator/0.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.001258 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.001295 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.047054 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.065599 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-k2njb_1c7db975-17d7-48dd-8e5a-0549749ab866/kube-rbac-proxy/0.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.136481 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-k2njb_1c7db975-17d7-48dd-8e5a-0549749ab866/manager/2.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.184801 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-k2njb_1c7db975-17d7-48dd-8e5a-0549749ab866/manager/1.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.235558 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-f4trp_f6bc1c64-200f-492f-bad9-dfecd5687698/kube-rbac-proxy/0.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.273762 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-f4trp_f6bc1c64-200f-492f-bad9-dfecd5687698/manager/2.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.385894 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-f4trp_f6bc1c64-200f-492f-bad9-dfecd5687698/manager/1.log" Nov 25 13:14:04 crc 
kubenswrapper[4693]: I1125 13:14:04.414166 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-qbjp2_28782f20-4534-4137-b590-7a3b31c638b2/operator/1.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.414984 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-qbjp2_28782f20-4534-4137-b590-7a3b31c638b2/operator/2.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.588745 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-bnf27_c3a7c8cb-ac3c-43d3-b38d-0c3625c53196/manager/2.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.603678 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-bnf27_c3a7c8cb-ac3c-43d3-b38d-0c3625c53196/manager/1.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.612344 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-bnf27_c3a7c8cb-ac3c-43d3-b38d-0c3625c53196/kube-rbac-proxy/0.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.695138 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-cwrvs_b9227546-dcce-4b09-9311-19f844deb318/kube-rbac-proxy/0.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.783532 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-cwrvs_b9227546-dcce-4b09-9311-19f844deb318/manager/2.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.841648 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-cwrvs_b9227546-dcce-4b09-9311-19f844deb318/manager/1.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.858869 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-kmpm8_ef0b302b-05d0-4be3-85ad-7eb3d70cec36/kube-rbac-proxy/0.log" Nov 25 13:14:04 crc kubenswrapper[4693]: I1125 13:14:04.945992 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-kmpm8_ef0b302b-05d0-4be3-85ad-7eb3d70cec36/manager/1.log" Nov 25 13:14:05 crc kubenswrapper[4693]: I1125 13:14:05.009016 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-kmpm8_ef0b302b-05d0-4be3-85ad-7eb3d70cec36/manager/0.log" Nov 25 13:14:05 crc kubenswrapper[4693]: I1125 13:14:05.044651 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-tc9jb_105791fd-407d-44a3-8fc8-af90e82b0f63/manager/2.log" Nov 25 13:14:05 crc kubenswrapper[4693]: I1125 13:14:05.048358 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-tc9jb_105791fd-407d-44a3-8fc8-af90e82b0f63/kube-rbac-proxy/0.log" Nov 25 13:14:05 crc kubenswrapper[4693]: I1125 13:14:05.062443 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:14:05 crc kubenswrapper[4693]: I1125 13:14:05.117445 4693 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/community-operators-82dm8"] Nov 25 13:14:05 crc kubenswrapper[4693]: I1125 13:14:05.166196 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-tc9jb_105791fd-407d-44a3-8fc8-af90e82b0f63/manager/1.log" Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.023574 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-82dm8" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="registry-server" containerID="cri-o://ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94" gracePeriod=2 Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.562169 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.645854 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4xhc\" (UniqueName: \"kubernetes.io/projected/87cbcdc8-d856-4b47-abb9-bbde836b0adf-kube-api-access-k4xhc\") pod \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.646207 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-catalog-content\") pod \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.646406 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-utilities\") pod \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\" (UID: \"87cbcdc8-d856-4b47-abb9-bbde836b0adf\") " Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.647893 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-utilities" (OuterVolumeSpecName: "utilities") pod "87cbcdc8-d856-4b47-abb9-bbde836b0adf" (UID: "87cbcdc8-d856-4b47-abb9-bbde836b0adf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.668480 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cbcdc8-d856-4b47-abb9-bbde836b0adf-kube-api-access-k4xhc" (OuterVolumeSpecName: "kube-api-access-k4xhc") pod "87cbcdc8-d856-4b47-abb9-bbde836b0adf" (UID: "87cbcdc8-d856-4b47-abb9-bbde836b0adf"). InnerVolumeSpecName "kube-api-access-k4xhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.711503 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87cbcdc8-d856-4b47-abb9-bbde836b0adf" (UID: "87cbcdc8-d856-4b47-abb9-bbde836b0adf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.749603 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.749646 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87cbcdc8-d856-4b47-abb9-bbde836b0adf-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 13:14:07 crc kubenswrapper[4693]: I1125 13:14:07.749660 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4xhc\" (UniqueName: \"kubernetes.io/projected/87cbcdc8-d856-4b47-abb9-bbde836b0adf-kube-api-access-k4xhc\") on node \"crc\" DevicePath \"\"" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.034781 4693 generic.go:334] "Generic (PLEG): container finished" podID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerID="ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94" exitCode=0 Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.034822 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82dm8" event={"ID":"87cbcdc8-d856-4b47-abb9-bbde836b0adf","Type":"ContainerDied","Data":"ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94"} Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.034850 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82dm8" event={"ID":"87cbcdc8-d856-4b47-abb9-bbde836b0adf","Type":"ContainerDied","Data":"1d2ff57820cc01400d6f66d1b0018caf7bf043f4efec6d99c93cd53f177ff914"} Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.034861 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-82dm8" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.034866 4693 scope.go:117] "RemoveContainer" containerID="ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.057631 4693 scope.go:117] "RemoveContainer" containerID="e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.074656 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-82dm8"] Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.082892 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-82dm8"] Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.097987 4693 scope.go:117] "RemoveContainer" containerID="143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.141233 4693 scope.go:117] "RemoveContainer" containerID="ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94" Nov 25 13:14:08 crc kubenswrapper[4693]: E1125 13:14:08.145265 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94\": container with ID starting with ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94 not found: ID does not exist" containerID="ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.146592 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94"} err="failed to get container status \"ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94\": rpc error: code = NotFound desc = could not find container \"ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94\": container with ID starting with ade87ebe69b9975c2fa6f121c1cf51465e829ef814d71086a1fbc34d7f134e94 not found: ID does not exist" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.146645 4693 scope.go:117] "RemoveContainer" containerID="e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7" Nov 25 13:14:08 crc kubenswrapper[4693]: E1125 13:14:08.147084 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7\": container with ID starting with e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7 not found: ID does not exist" containerID="e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.147127 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7"} err="failed to get container status \"e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7\": rpc error: code = NotFound desc = could not find container \"e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7\": container with ID starting with e4dec904b5f6169613acf7b08587a4114f1cda178082965dbf4fa59ee20f5ca7 not found: ID does not exist" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.147171 4693 scope.go:117] "RemoveContainer" 
containerID="143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17" Nov 25 13:14:08 crc kubenswrapper[4693]: E1125 13:14:08.147490 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17\": container with ID starting with 143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17 not found: ID does not exist" containerID="143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.147540 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17"} err="failed to get container status \"143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17\": rpc error: code = NotFound desc = could not find container \"143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17\": container with ID starting with 143499e43b12dcb6c1818177aba127a860e97cb058230a57ac4e3946a71eae17 not found: ID does not exist" Nov 25 13:14:08 crc kubenswrapper[4693]: I1125 13:14:08.847285 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" path="/var/lib/kubelet/pods/87cbcdc8-d856-4b47-abb9-bbde836b0adf/volumes" Nov 25 13:14:14 crc kubenswrapper[4693]: I1125 13:14:14.812993 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535" Nov 25 13:14:15 crc kubenswrapper[4693]: I1125 13:14:15.109032 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"794dc7bc073fb7a6e7f46a652046f35088cf87ccd0ec815580db9f16fa9c5083"} Nov 25 13:14:23 crc kubenswrapper[4693]: I1125 13:14:23.877567 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5v4pg_264f1d17-cf59-4dbf-ad2f-0272713fe3b0/control-plane-machine-set-operator/0.log" Nov 25 13:14:24 crc kubenswrapper[4693]: I1125 13:14:24.050183 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mkt5h_bccd7dbe-e658-4ce4-be99-b6642a5bb498/kube-rbac-proxy/0.log" Nov 25 13:14:24 crc kubenswrapper[4693]: I1125 13:14:24.098000 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mkt5h_bccd7dbe-e658-4ce4-be99-b6642a5bb498/machine-api-operator/0.log" Nov 25 13:14:35 crc kubenswrapper[4693]: I1125 13:14:35.685826 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-zc5h2_b9284e69-f82a-44ea-bee3-627c08d1d86c/cert-manager-controller/0.log" Nov 25 13:14:35 crc kubenswrapper[4693]: I1125 13:14:35.891239 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-krwsc_886fc2dd-e1c6-4822-b516-1540c9e77f39/cert-manager-cainjector/1.log" Nov 25 13:14:35 crc kubenswrapper[4693]: I1125 13:14:35.895471 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-krwsc_886fc2dd-e1c6-4822-b516-1540c9e77f39/cert-manager-cainjector/0.log" Nov 25 13:14:35 crc kubenswrapper[4693]: I1125 13:14:35.961654 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-94qfr_6fff8a61-6848-4e20-bc9b-cc0d8e4299d4/cert-manager-webhook/0.log" Nov 25 13:14:47 crc kubenswrapper[4693]: I1125 13:14:47.284282 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-wc25g_fef9b8d4-8a67-486c-84d4-f0053c7efe32/nmstate-console-plugin/0.log" Nov 25 13:14:47 crc kubenswrapper[4693]: I1125 13:14:47.472026 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k95kj_b2d6c353-42d6-4c35-8c14-925f97540979/nmstate-handler/0.log" Nov 25 13:14:47 crc kubenswrapper[4693]: I1125 13:14:47.512176 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-sxzv8_0a6d6078-b39e-4528-a765-5624dee71294/nmstate-metrics/0.log" Nov 25 13:14:47 crc kubenswrapper[4693]: I1125 13:14:47.554096 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-sxzv8_0a6d6078-b39e-4528-a765-5624dee71294/kube-rbac-proxy/0.log" Nov 25 13:14:47 crc kubenswrapper[4693]: I1125 13:14:47.675437 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-vv6jm_d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d/nmstate-operator/0.log" Nov 25 13:14:47 crc kubenswrapper[4693]: I1125 13:14:47.773631 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-wv8pz_27d9ec74-a9f1-4971-a6ad-16703ad324ad/nmstate-webhook/0.log" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.153076 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns"] Nov 25 13:15:00 crc kubenswrapper[4693]: E1125 13:15:00.154258 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="registry-server" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.154274 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="registry-server" Nov 25 13:15:00 crc kubenswrapper[4693]: E1125 13:15:00.154289 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="extract-utilities" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.154297 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="extract-utilities" Nov 25 13:15:00 crc kubenswrapper[4693]: E1125 13:15:00.154319 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="extract-content" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.154327 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="extract-content" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.154656 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="87cbcdc8-d856-4b47-abb9-bbde836b0adf" containerName="registry-server" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.155529 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.164172 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns"] Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.164728 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.164875 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.267832 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmz7\" (UniqueName: \"kubernetes.io/projected/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-kube-api-access-zhmz7\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.268341 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-config-volume\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.268718 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-secret-volume\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.370335 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-secret-volume\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.370433 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhmz7\" (UniqueName: \"kubernetes.io/projected/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-kube-api-access-zhmz7\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.370456 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-config-volume\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.371770 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-config-volume\") pod 
\"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.521068 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmz7\" (UniqueName: \"kubernetes.io/projected/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-kube-api-access-zhmz7\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.525454 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-secret-volume\") pod \"collect-profiles-29401275-g2vns\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:00 crc kubenswrapper[4693]: I1125 13:15:00.779275 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.104100 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-m86lr_89cc79c1-2d72-47b8-abcb-14af4fb9afe7/kube-rbac-proxy/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.213518 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-m86lr_89cc79c1-2d72-47b8-abcb-14af4fb9afe7/controller/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.255939 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns"] Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.339424 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.525276 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.528193 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.549140 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" event={"ID":"f3e571fc-1550-4d54-b8bf-b54aff04fe6e","Type":"ContainerStarted","Data":"e8ddaa5b1bb7089abcd618acddf666c2346aa201af9dd9e8156f82ec19c6f85f"} Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.549183 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" event={"ID":"f3e571fc-1550-4d54-b8bf-b54aff04fe6e","Type":"ContainerStarted","Data":"b30636f60a1fa117a4ed6be6c1d7fa62fb6db82a6f899741bf7c8d85c75ed8b3"} Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.574315 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" podStartSLOduration=1.574291413 podStartE2EDuration="1.574291413s" podCreationTimestamp="2025-11-25 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 13:15:01.562515224 +0000 UTC m=+4021.480600615" watchObservedRunningTime="2025-11-25 13:15:01.574291413 +0000 UTC m=+4021.492376784" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.602275 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.629219 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.769131 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.769288 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.808951 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log" Nov 25 13:15:01 crc kubenswrapper[4693]: I1125 13:15:01.842604 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.054141 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.064500 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.069176 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.083855 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/controller/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.226571 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/frr-metrics/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.251413 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/kube-rbac-proxy/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.337035 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/kube-rbac-proxy-frr/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.476661 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/reloader/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.559199 4693 generic.go:334] "Generic (PLEG): container finished" podID="f3e571fc-1550-4d54-b8bf-b54aff04fe6e" containerID="e8ddaa5b1bb7089abcd618acddf666c2346aa201af9dd9e8156f82ec19c6f85f" exitCode=0 Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.559245 4693 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" event={"ID":"f3e571fc-1550-4d54-b8bf-b54aff04fe6e","Type":"ContainerDied","Data":"e8ddaa5b1bb7089abcd618acddf666c2346aa201af9dd9e8156f82ec19c6f85f"} Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.606140 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-54csl_b92f94fa-96a8-4257-890e-076b4292b487/frr-k8s-webhook-server/0.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.723266 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5995bbfc5f-c8gkc_0d2b9e6f-fe11-47e3-af7b-cca0fff65798/manager/3.log" Nov 25 13:15:02 crc kubenswrapper[4693]: I1125 13:15:02.838483 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5995bbfc5f-c8gkc_0d2b9e6f-fe11-47e3-af7b-cca0fff65798/manager/2.log" Nov 25 13:15:03 crc kubenswrapper[4693]: I1125 13:15:03.015227 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b574576ff-z9ftm_9c9e4728-76ad-4ae9-8ef9-87cff7db96c3/webhook-server/0.log" Nov 25 13:15:03 crc kubenswrapper[4693]: I1125 13:15:03.197928 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dnzwb_667fdb6a-4e0e-4b92-ae50-aa1880c69402/kube-rbac-proxy/0.log" Nov 25 13:15:03 crc kubenswrapper[4693]: I1125 13:15:03.774174 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/frr/0.log" Nov 25 13:15:03 crc kubenswrapper[4693]: I1125 13:15:03.854694 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dnzwb_667fdb6a-4e0e-4b92-ae50-aa1880c69402/speaker/0.log" Nov 25 13:15:03 crc kubenswrapper[4693]: I1125 13:15:03.986057 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.141791 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-secret-volume\") pod \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.142042 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhmz7\" (UniqueName: \"kubernetes.io/projected/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-kube-api-access-zhmz7\") pod \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.142116 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-config-volume\") pod \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\" (UID: \"f3e571fc-1550-4d54-b8bf-b54aff04fe6e\") " Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.142794 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-config-volume" (OuterVolumeSpecName: "config-volume") pod "f3e571fc-1550-4d54-b8bf-b54aff04fe6e" (UID: "f3e571fc-1550-4d54-b8bf-b54aff04fe6e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.150249 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f3e571fc-1550-4d54-b8bf-b54aff04fe6e" (UID: "f3e571fc-1550-4d54-b8bf-b54aff04fe6e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.150367 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-kube-api-access-zhmz7" (OuterVolumeSpecName: "kube-api-access-zhmz7") pod "f3e571fc-1550-4d54-b8bf-b54aff04fe6e" (UID: "f3e571fc-1550-4d54-b8bf-b54aff04fe6e"). InnerVolumeSpecName "kube-api-access-zhmz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.243821 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.243895 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.243907 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhmz7\" (UniqueName: \"kubernetes.io/projected/f3e571fc-1550-4d54-b8bf-b54aff04fe6e-kube-api-access-zhmz7\") on node \"crc\" DevicePath \"\"" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.320461 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb"] Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.329322 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401230-jgcvb"] Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.588244 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" event={"ID":"f3e571fc-1550-4d54-b8bf-b54aff04fe6e","Type":"ContainerDied","Data":"b30636f60a1fa117a4ed6be6c1d7fa62fb6db82a6f899741bf7c8d85c75ed8b3"} Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.588284 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b30636f60a1fa117a4ed6be6c1d7fa62fb6db82a6f899741bf7c8d85c75ed8b3" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.588335 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401275-g2vns" Nov 25 13:15:04 crc kubenswrapper[4693]: I1125 13:15:04.822098 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8714946b-3179-48cb-b0d4-9be5bbd4d3a5" path="/var/lib/kubelet/pods/8714946b-3179-48cb-b0d4-9be5bbd4d3a5/volumes" Nov 25 13:15:15 crc kubenswrapper[4693]: I1125 13:15:15.283621 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/util/0.log" Nov 25 13:15:15 crc kubenswrapper[4693]: I1125 13:15:15.448288 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/util/0.log" Nov 25 13:15:15 crc kubenswrapper[4693]: I1125 13:15:15.488593 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/pull/0.log" Nov 25 13:15:15 crc kubenswrapper[4693]: I1125 13:15:15.489742 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/pull/0.log" Nov 25 13:15:15 crc kubenswrapper[4693]: I1125 13:15:15.692444 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/util/0.log" Nov 25 13:15:15 crc kubenswrapper[4693]: I1125 13:15:15.695645 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/pull/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.292462 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-utilities/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.332650 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/extract/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.488151 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-utilities/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.494625 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-content/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.517110 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-content/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.707477 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-utilities/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.780556 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-content/0.log" Nov 25 13:15:16 crc kubenswrapper[4693]: I1125 13:15:16.927225 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-utilities/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.115839 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-content/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.126864 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-content/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.158047 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-utilities/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.352573 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-content/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.411657 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/registry-server/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.523057 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-utilities/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.631469 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/util/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.880485 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/pull/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.880552 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/pull/0.log" Nov 25 13:15:17 crc kubenswrapper[4693]: I1125 13:15:17.927294 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/util/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.085534 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/registry-server/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.143227 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/pull/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.144561 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/extract/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.154107 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/util/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.313433 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-87z2l_c97ed1b2-4d1e-45f8-9aa7-67336324d2cc/marketplace-operator/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.356427 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhsc2_0c22ae76-cd45-47d2-bc53-c9e3d6026512/extract-utilities/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.564450 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhsc2_0c22ae76-cd45-47d2-bc53-c9e3d6026512/extract-utilities/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.570018 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhsc2_0c22ae76-cd45-47d2-bc53-c9e3d6026512/extract-content/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.583460 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhsc2_0c22ae76-cd45-47d2-bc53-c9e3d6026512/extract-content/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.726904 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhsc2_0c22ae76-cd45-47d2-bc53-c9e3d6026512/extract-content/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.798517 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhsc2_0c22ae76-cd45-47d2-bc53-c9e3d6026512/extract-utilities/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.887963 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhsc2_0c22ae76-cd45-47d2-bc53-c9e3d6026512/registry-server/0.log" Nov 25 13:15:18 crc kubenswrapper[4693]: I1125 13:15:18.917819 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-utilities/0.log" Nov 25 13:15:19 crc kubenswrapper[4693]: I1125 13:15:19.093643 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-content/0.log" Nov 25 13:15:19 crc kubenswrapper[4693]: I1125 13:15:19.109911 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-utilities/0.log" Nov 25 13:15:19 crc kubenswrapper[4693]: I1125 13:15:19.123727 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-content/0.log" Nov 25 13:15:19 crc kubenswrapper[4693]: I1125 13:15:19.279528 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-content/0.log" Nov 25 13:15:19 crc kubenswrapper[4693]: I1125 13:15:19.297648 4693 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-utilities/0.log" Nov 25 13:15:19 crc kubenswrapper[4693]: I1125 13:15:19.445852 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/registry-server/0.log" Nov 25 13:15:29 crc kubenswrapper[4693]: I1125 13:15:29.500188 4693 scope.go:117] "RemoveContainer" containerID="90a3cbd8ef370cf4a178b6763cbbc8f14b25f6dd3c453e63b4357f7ec542f040" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.706358 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lm42v"] Nov 25 13:16:30 crc kubenswrapper[4693]: E1125 13:16:30.707404 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e571fc-1550-4d54-b8bf-b54aff04fe6e" containerName="collect-profiles" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.707416 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e571fc-1550-4d54-b8bf-b54aff04fe6e" containerName="collect-profiles" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.707667 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e571fc-1550-4d54-b8bf-b54aff04fe6e" containerName="collect-profiles" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.709008 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.732682 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lm42v"] Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.822167 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7xp6\" (UniqueName: \"kubernetes.io/projected/f1511af3-3ca9-439c-b941-18a792f99932-kube-api-access-x7xp6\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.822743 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1511af3-3ca9-439c-b941-18a792f99932-catalog-content\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.822843 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1511af3-3ca9-439c-b941-18a792f99932-utilities\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.924102 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1511af3-3ca9-439c-b941-18a792f99932-utilities\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.924248 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7xp6\" (UniqueName: 
\"kubernetes.io/projected/f1511af3-3ca9-439c-b941-18a792f99932-kube-api-access-x7xp6\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.924400 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1511af3-3ca9-439c-b941-18a792f99932-catalog-content\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.924736 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1511af3-3ca9-439c-b941-18a792f99932-utilities\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.925241 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1511af3-3ca9-439c-b941-18a792f99932-catalog-content\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:30 crc kubenswrapper[4693]: I1125 13:16:30.953078 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7xp6\" (UniqueName: \"kubernetes.io/projected/f1511af3-3ca9-439c-b941-18a792f99932-kube-api-access-x7xp6\") pod \"redhat-marketplace-lm42v\" (UID: \"f1511af3-3ca9-439c-b941-18a792f99932\") " pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:31 crc kubenswrapper[4693]: I1125 13:16:31.030454 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:31 crc kubenswrapper[4693]: I1125 13:16:31.601307 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lm42v"] Nov 25 13:16:32 crc kubenswrapper[4693]: I1125 13:16:32.481004 4693 generic.go:334] "Generic (PLEG): container finished" podID="f1511af3-3ca9-439c-b941-18a792f99932" containerID="67528dc499642bc3f0607a42f08caaf900abe6719b4c66ec46977cc5b3bc728a" exitCode=0 Nov 25 13:16:32 crc kubenswrapper[4693]: I1125 13:16:32.481075 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lm42v" event={"ID":"f1511af3-3ca9-439c-b941-18a792f99932","Type":"ContainerDied","Data":"67528dc499642bc3f0607a42f08caaf900abe6719b4c66ec46977cc5b3bc728a"} Nov 25 13:16:32 crc kubenswrapper[4693]: I1125 13:16:32.481801 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lm42v" event={"ID":"f1511af3-3ca9-439c-b941-18a792f99932","Type":"ContainerStarted","Data":"919f72e6f94f9369676898883d282da03105ea2e4e1ace1abc021cf413af85c8"} Nov 25 13:16:32 crc kubenswrapper[4693]: I1125 13:16:32.483956 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 13:16:35 crc kubenswrapper[4693]: I1125 13:16:35.114043 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:16:35 crc kubenswrapper[4693]: I1125 13:16:35.114705 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:16:38 crc kubenswrapper[4693]: I1125 13:16:38.551709 4693 generic.go:334] "Generic (PLEG): container finished" podID="f1511af3-3ca9-439c-b941-18a792f99932" containerID="f1b5b5625a818ef8a6365e6a6beb1d7ae13e986101c628c215ff2adb6d57a697" exitCode=0 Nov 25 13:16:38 crc kubenswrapper[4693]: I1125 13:16:38.551944 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lm42v" event={"ID":"f1511af3-3ca9-439c-b941-18a792f99932","Type":"ContainerDied","Data":"f1b5b5625a818ef8a6365e6a6beb1d7ae13e986101c628c215ff2adb6d57a697"} Nov 25 13:16:40 crc kubenswrapper[4693]: I1125 13:16:40.577182 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lm42v" event={"ID":"f1511af3-3ca9-439c-b941-18a792f99932","Type":"ContainerStarted","Data":"87c8590e963811580c06f7f794b44fbfa267df63a8a6d8627439dc036355f497"} Nov 25 13:16:40 crc kubenswrapper[4693]: I1125 13:16:40.603688 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lm42v" podStartSLOduration=3.438995569 podStartE2EDuration="10.60366301s" podCreationTimestamp="2025-11-25 13:16:30 +0000 UTC" firstStartedPulling="2025-11-25 13:16:32.483422701 +0000 UTC m=+4112.401508112" lastFinishedPulling="2025-11-25 13:16:39.648090172 +0000 UTC m=+4119.566175553" observedRunningTime="2025-11-25 13:16:40.592477577 +0000 UTC m=+4120.510562958" watchObservedRunningTime="2025-11-25 
13:16:40.60366301 +0000 UTC m=+4120.521748381" Nov 25 13:16:41 crc kubenswrapper[4693]: I1125 13:16:41.031628 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:41 crc kubenswrapper[4693]: I1125 13:16:41.031962 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:42 crc kubenswrapper[4693]: I1125 13:16:42.078132 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-lm42v" podUID="f1511af3-3ca9-439c-b941-18a792f99932" containerName="registry-server" probeResult="failure" output=< Nov 25 13:16:42 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:16:42 crc kubenswrapper[4693]: > Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.079823 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.133280 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lm42v" Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.249629 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lm42v"] Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.323249 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhsc2"] Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.323515 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rhsc2" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="registry-server" containerID="cri-o://99aedac81dfc49e5e8b6f52a559bd21dcf08f78361b31a5b595fa450b6d9b912" gracePeriod=2 Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.694686 4693 generic.go:334] "Generic (PLEG): container finished" podID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerID="99aedac81dfc49e5e8b6f52a559bd21dcf08f78361b31a5b595fa450b6d9b912" exitCode=0 Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.694747 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhsc2" event={"ID":"0c22ae76-cd45-47d2-bc53-c9e3d6026512","Type":"ContainerDied","Data":"99aedac81dfc49e5e8b6f52a559bd21dcf08f78361b31a5b595fa450b6d9b912"} Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.880960 4693 util.go:48] "No ready sandbox for pod can be found. 
Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.972550 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-catalog-content\") pod \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") "
Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.972807 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-utilities\") pod \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") "
Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.972877 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2s47\" (UniqueName: \"kubernetes.io/projected/0c22ae76-cd45-47d2-bc53-c9e3d6026512-kube-api-access-j2s47\") pod \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\" (UID: \"0c22ae76-cd45-47d2-bc53-c9e3d6026512\") "
Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.974342 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-utilities" (OuterVolumeSpecName: "utilities") pod "0c22ae76-cd45-47d2-bc53-c9e3d6026512" (UID: "0c22ae76-cd45-47d2-bc53-c9e3d6026512"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:16:51 crc kubenswrapper[4693]: I1125 13:16:51.987338 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c22ae76-cd45-47d2-bc53-c9e3d6026512-kube-api-access-j2s47" (OuterVolumeSpecName: "kube-api-access-j2s47") pod "0c22ae76-cd45-47d2-bc53-c9e3d6026512" (UID: "0c22ae76-cd45-47d2-bc53-c9e3d6026512"). InnerVolumeSpecName "kube-api-access-j2s47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.075011 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.075050 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2s47\" (UniqueName: \"kubernetes.io/projected/0c22ae76-cd45-47d2-bc53-c9e3d6026512-kube-api-access-j2s47\") on node \"crc\" DevicePath \"\""
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.275555 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c22ae76-cd45-47d2-bc53-c9e3d6026512" (UID: "0c22ae76-cd45-47d2-bc53-c9e3d6026512"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.279905 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c22ae76-cd45-47d2-bc53-c9e3d6026512-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.714052 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhsc2" event={"ID":"0c22ae76-cd45-47d2-bc53-c9e3d6026512","Type":"ContainerDied","Data":"e7a321d24a2f09e286386ca9fc534e4bef7f5d61805627ad0a7af4f2894ef904"}
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.714134 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhsc2"
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.714139 4693 scope.go:117] "RemoveContainer" containerID="99aedac81dfc49e5e8b6f52a559bd21dcf08f78361b31a5b595fa450b6d9b912"
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.747859 4693 scope.go:117] "RemoveContainer" containerID="ad56b976e35142af1d5a3b84d9b3ce3b60744f85e34549b73e4d4dc094eaaf1c"
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.762117 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhsc2"]
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.772844 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhsc2"]
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.776601 4693 scope.go:117] "RemoveContainer" containerID="fb8cbfe6030020fe580364e910ad733bbfe900552e91877a4b69c29b053b302c"
Nov 25 13:16:52 crc kubenswrapper[4693]: I1125 13:16:52.823980 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" path="/var/lib/kubelet/pods/0c22ae76-cd45-47d2-bc53-c9e3d6026512/volumes"
Nov 25 13:17:02 crc kubenswrapper[4693]: I1125 13:17:02.816155 4693 generic.go:334] "Generic (PLEG): container finished" podID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerID="5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e" exitCode=0
Nov 25 13:17:02 crc kubenswrapper[4693]: I1125 13:17:02.828057 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8svzn/must-gather-zvnp5" event={"ID":"dc8f0717-6d45-4672-b4c6-3fb3011972eb","Type":"ContainerDied","Data":"5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e"}
Nov 25 13:17:02 crc kubenswrapper[4693]: I1125 13:17:02.828664 4693 scope.go:117] "RemoveContainer" containerID="5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e"
Nov 25 13:17:03 crc kubenswrapper[4693]: I1125 13:17:03.262408 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8svzn_must-gather-zvnp5_dc8f0717-6d45-4672-b4c6-3fb3011972eb/gather/0.log"
Nov 25 13:17:05 crc kubenswrapper[4693]: I1125 13:17:05.114335 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 13:17:05 crc kubenswrapper[4693]: I1125 13:17:05.114896 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.040092 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8svzn/must-gather-zvnp5"]
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.041034 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8svzn/must-gather-zvnp5" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerName="copy" containerID="cri-o://b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5" gracePeriod=2
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.052026 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8svzn/must-gather-zvnp5"]
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.733945 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8svzn_must-gather-zvnp5_dc8f0717-6d45-4672-b4c6-3fb3011972eb/copy/0.log"
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.734601 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/must-gather-zvnp5"
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.883816 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbxm\" (UniqueName: \"kubernetes.io/projected/dc8f0717-6d45-4672-b4c6-3fb3011972eb-kube-api-access-fzbxm\") pod \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") "
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.885241 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc8f0717-6d45-4672-b4c6-3fb3011972eb-must-gather-output\") pod \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\" (UID: \"dc8f0717-6d45-4672-b4c6-3fb3011972eb\") "
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.899608 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8f0717-6d45-4672-b4c6-3fb3011972eb-kube-api-access-fzbxm" (OuterVolumeSpecName: "kube-api-access-fzbxm") pod "dc8f0717-6d45-4672-b4c6-3fb3011972eb" (UID: "dc8f0717-6d45-4672-b4c6-3fb3011972eb"). InnerVolumeSpecName "kube-api-access-fzbxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.918553 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8svzn_must-gather-zvnp5_dc8f0717-6d45-4672-b4c6-3fb3011972eb/copy/0.log"
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.918989 4693 generic.go:334] "Generic (PLEG): container finished" podID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerID="b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5" exitCode=143
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.919034 4693 scope.go:117] "RemoveContainer" containerID="b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5"
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.919131 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8svzn/must-gather-zvnp5"
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.978895 4693 scope.go:117] "RemoveContainer" containerID="5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e"
Nov 25 13:17:11 crc kubenswrapper[4693]: I1125 13:17:11.988698 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbxm\" (UniqueName: \"kubernetes.io/projected/dc8f0717-6d45-4672-b4c6-3fb3011972eb-kube-api-access-fzbxm\") on node \"crc\" DevicePath \"\""
Nov 25 13:17:12 crc kubenswrapper[4693]: I1125 13:17:12.043436 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc8f0717-6d45-4672-b4c6-3fb3011972eb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dc8f0717-6d45-4672-b4c6-3fb3011972eb" (UID: "dc8f0717-6d45-4672-b4c6-3fb3011972eb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:17:12 crc kubenswrapper[4693]: I1125 13:17:12.059547 4693 scope.go:117] "RemoveContainer" containerID="b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5"
Nov 25 13:17:12 crc kubenswrapper[4693]: E1125 13:17:12.062262 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5\": container with ID starting with b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5 not found: ID does not exist" containerID="b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5"
Nov 25 13:17:12 crc kubenswrapper[4693]: I1125 13:17:12.062308 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5"} err="failed to get container status \"b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5\": rpc error: code = NotFound desc = could not find container \"b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5\": container with ID starting with b061556706b428a143b7a6fc7f15541d2232190b40849002afa0b538843f15f5 not found: ID does not exist"
Nov 25 13:17:12 crc kubenswrapper[4693]: I1125 13:17:12.062359 4693 scope.go:117] "RemoveContainer" containerID="5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e"
Nov 25 13:17:12 crc kubenswrapper[4693]: E1125 13:17:12.064433 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e\": container with ID starting with 5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e not found: ID does not exist" containerID="5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e"
Nov 25 13:17:12 crc kubenswrapper[4693]: I1125 13:17:12.064489 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e"} err="failed to get container status \"5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e\": rpc error: code = NotFound desc = could not find container \"5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e\": container with ID starting with 5bf4f087c502ed171d2db8b34dbb1f0c3393a128b506b01857285a049fe17e4e not found: ID does not exist"
Nov 25 13:17:12 crc kubenswrapper[4693]: I1125 13:17:12.090665 4693 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dc8f0717-6d45-4672-b4c6-3fb3011972eb-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 25 13:17:12 crc kubenswrapper[4693]: I1125 13:17:12.828339 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" path="/var/lib/kubelet/pods/dc8f0717-6d45-4672-b4c6-3fb3011972eb/volumes"
Nov 25 13:17:35 crc kubenswrapper[4693]: I1125 13:17:35.113667 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 13:17:35 crc kubenswrapper[4693]: I1125 13:17:35.114355 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 13:17:35 crc kubenswrapper[4693]: I1125 13:17:35.114427 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d"
Nov 25 13:17:35 crc kubenswrapper[4693]: I1125 13:17:35.115420 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"794dc7bc073fb7a6e7f46a652046f35088cf87ccd0ec815580db9f16fa9c5083"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 25 13:17:35 crc kubenswrapper[4693]: I1125 13:17:35.115540 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://794dc7bc073fb7a6e7f46a652046f35088cf87ccd0ec815580db9f16fa9c5083" gracePeriod=600
Nov 25 13:17:36 crc kubenswrapper[4693]: I1125 13:17:36.183204 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="794dc7bc073fb7a6e7f46a652046f35088cf87ccd0ec815580db9f16fa9c5083" exitCode=0
Nov 25 13:17:36 crc kubenswrapper[4693]: I1125 13:17:36.183269 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"794dc7bc073fb7a6e7f46a652046f35088cf87ccd0ec815580db9f16fa9c5083"}
Nov 25 13:17:36 crc kubenswrapper[4693]: I1125 13:17:36.184261 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"}
Nov 25 13:17:36 crc kubenswrapper[4693]: I1125 13:17:36.184307 4693 scope.go:117] "RemoveContainer" containerID="9f1f6b08e4a8b545cba45cd0fb2e7e0c655ac4fa810dc6a9d3107e9291dfb535"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.497990 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gkzch"]
Nov 25 13:19:25 crc kubenswrapper[4693]: E1125 13:19:25.499139 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerName="copy"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499154 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerName="copy"
Nov 25 13:19:25 crc kubenswrapper[4693]: E1125 13:19:25.499178 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="registry-server"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499186 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="registry-server"
Nov 25 13:19:25 crc kubenswrapper[4693]: E1125 13:19:25.499200 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="extract-utilities"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499208 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="extract-utilities"
Nov 25 13:19:25 crc kubenswrapper[4693]: E1125 13:19:25.499219 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="extract-content"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499228 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="extract-content"
Nov 25 13:19:25 crc kubenswrapper[4693]: E1125 13:19:25.499267 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerName="gather"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499275 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerName="gather"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499610 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerName="copy"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499631 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8f0717-6d45-4672-b4c6-3fb3011972eb" containerName="gather"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.499647 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c22ae76-cd45-47d2-bc53-c9e3d6026512" containerName="registry-server"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.501388 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.511816 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkzch"]
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.594880 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-catalog-content\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.594929 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-utilities\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.594983 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhctm\" (UniqueName: \"kubernetes.io/projected/faba52c6-cbea-45df-a598-5c47cde7c00a-kube-api-access-nhctm\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.696807 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-catalog-content\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.696863 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-utilities\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.697149 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhctm\" (UniqueName: \"kubernetes.io/projected/faba52c6-cbea-45df-a598-5c47cde7c00a-kube-api-access-nhctm\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.697807 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-utilities\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.697842 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-catalog-content\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.724115 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhctm\" (UniqueName: \"kubernetes.io/projected/faba52c6-cbea-45df-a598-5c47cde7c00a-kube-api-access-nhctm\") pod \"certified-operators-gkzch\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") " pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:25 crc kubenswrapper[4693]: I1125 13:19:25.830156 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:26 crc kubenswrapper[4693]: I1125 13:19:26.351546 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkzch"]
Nov 25 13:19:27 crc kubenswrapper[4693]: I1125 13:19:27.227488 4693 generic.go:334] "Generic (PLEG): container finished" podID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerID="4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5" exitCode=0
Nov 25 13:19:27 crc kubenswrapper[4693]: I1125 13:19:27.227643 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzch" event={"ID":"faba52c6-cbea-45df-a598-5c47cde7c00a","Type":"ContainerDied","Data":"4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5"}
Nov 25 13:19:27 crc kubenswrapper[4693]: I1125 13:19:27.228040 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzch" event={"ID":"faba52c6-cbea-45df-a598-5c47cde7c00a","Type":"ContainerStarted","Data":"cd5797cdafc31ea32c9c45b7d56c093f388aad31afd254de9a63716c415b529d"}
Nov 25 13:19:29 crc kubenswrapper[4693]: I1125 13:19:29.250154 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzch" event={"ID":"faba52c6-cbea-45df-a598-5c47cde7c00a","Type":"ContainerStarted","Data":"67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0"}
Nov 25 13:19:30 crc kubenswrapper[4693]: I1125 13:19:30.261648 4693 generic.go:334] "Generic (PLEG): container finished" podID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerID="67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0" exitCode=0
Nov 25 13:19:30 crc kubenswrapper[4693]: I1125 13:19:30.261695 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzch" event={"ID":"faba52c6-cbea-45df-a598-5c47cde7c00a","Type":"ContainerDied","Data":"67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0"}
Nov 25 13:19:32 crc kubenswrapper[4693]: I1125 13:19:32.285087 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzch" event={"ID":"faba52c6-cbea-45df-a598-5c47cde7c00a","Type":"ContainerStarted","Data":"a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526"}
Nov 25 13:19:32 crc kubenswrapper[4693]: I1125 13:19:32.319272 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gkzch" podStartSLOduration=3.403906024 podStartE2EDuration="7.319249248s" podCreationTimestamp="2025-11-25 13:19:25 +0000 UTC" firstStartedPulling="2025-11-25 13:19:27.229837341 +0000 UTC m=+4287.147922712" lastFinishedPulling="2025-11-25 13:19:31.145180555 +0000 UTC m=+4291.063265936" observedRunningTime="2025-11-25 13:19:32.306157516 +0000 UTC m=+4292.224242917" watchObservedRunningTime="2025-11-25 13:19:32.319249248 +0000 UTC m=+4292.237334649"
Nov 25 13:19:35 crc kubenswrapper[4693]: I1125 13:19:35.113493 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 13:19:35 crc kubenswrapper[4693]: I1125 13:19:35.113783 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 13:19:35 crc kubenswrapper[4693]: I1125 13:19:35.830761 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:35 crc kubenswrapper[4693]: I1125 13:19:35.830829 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:35 crc kubenswrapper[4693]: I1125 13:19:35.887708 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:36 crc kubenswrapper[4693]: I1125 13:19:36.389322 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:36 crc kubenswrapper[4693]: I1125 13:19:36.448353 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkzch"]
Nov 25 13:19:38 crc kubenswrapper[4693]: I1125 13:19:38.342202 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gkzch" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="registry-server" containerID="cri-o://a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526" gracePeriod=2
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.313017 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.373025 4693 generic.go:334] "Generic (PLEG): container finished" podID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerID="a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526" exitCode=0
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.373073 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzch" event={"ID":"faba52c6-cbea-45df-a598-5c47cde7c00a","Type":"ContainerDied","Data":"a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526"}
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.373099 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzch" event={"ID":"faba52c6-cbea-45df-a598-5c47cde7c00a","Type":"ContainerDied","Data":"cd5797cdafc31ea32c9c45b7d56c093f388aad31afd254de9a63716c415b529d"}
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.373124 4693 scope.go:117] "RemoveContainer" containerID="a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.373199 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkzch"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.405784 4693 scope.go:117] "RemoveContainer" containerID="67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.434360 4693 scope.go:117] "RemoveContainer" containerID="4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.459101 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhctm\" (UniqueName: \"kubernetes.io/projected/faba52c6-cbea-45df-a598-5c47cde7c00a-kube-api-access-nhctm\") pod \"faba52c6-cbea-45df-a598-5c47cde7c00a\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") "
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.459184 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-catalog-content\") pod \"faba52c6-cbea-45df-a598-5c47cde7c00a\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") "
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.459320 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-utilities\") pod \"faba52c6-cbea-45df-a598-5c47cde7c00a\" (UID: \"faba52c6-cbea-45df-a598-5c47cde7c00a\") "
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.460112 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-utilities" (OuterVolumeSpecName: "utilities") pod "faba52c6-cbea-45df-a598-5c47cde7c00a" (UID: "faba52c6-cbea-45df-a598-5c47cde7c00a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.465895 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faba52c6-cbea-45df-a598-5c47cde7c00a-kube-api-access-nhctm" (OuterVolumeSpecName: "kube-api-access-nhctm") pod "faba52c6-cbea-45df-a598-5c47cde7c00a" (UID: "faba52c6-cbea-45df-a598-5c47cde7c00a"). InnerVolumeSpecName "kube-api-access-nhctm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.478029 4693 scope.go:117] "RemoveContainer" containerID="a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526"
Nov 25 13:19:39 crc kubenswrapper[4693]: E1125 13:19:39.478556 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526\": container with ID starting with a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526 not found: ID does not exist" containerID="a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.478638 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526"} err="failed to get container status \"a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526\": rpc error: code = NotFound desc = could not find container \"a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526\": container with ID starting with a7ea289f14d39b8541db9909a231eda28a898ca2e29d64514a9672ce9de65526 not found: ID does not exist"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.478672 4693 scope.go:117] "RemoveContainer" containerID="67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0"
Nov 25 13:19:39 crc kubenswrapper[4693]: E1125 13:19:39.479035 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0\": container with ID starting with 67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0 not found: ID does not exist" containerID="67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.479091 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0"} err="failed to get container status \"67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0\": rpc error: code = NotFound desc = could not find container \"67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0\": container with ID starting with 67df29950cda3bf958939eb60651bd21a1db048ea3b6bc97faeaf821e10d7cd0 not found: ID does not exist"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.479123 4693 scope.go:117] "RemoveContainer" containerID="4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5"
Nov 25 13:19:39 crc kubenswrapper[4693]: E1125 13:19:39.479425 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5\": container with ID starting with 4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5 not found: ID does not exist" containerID="4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.479461 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5"} err="failed to get container status \"4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5\": rpc error: code = NotFound desc = could not find container \"4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5\": container with ID starting with 4ed83c80de61ccddebf775629f24db618f18a289c7aeaecb10391dbdc20fafc5 not found: ID does not exist"
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.520903 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faba52c6-cbea-45df-a598-5c47cde7c00a" (UID: "faba52c6-cbea-45df-a598-5c47cde7c00a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.561416 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.561452 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhctm\" (UniqueName: \"kubernetes.io/projected/faba52c6-cbea-45df-a598-5c47cde7c00a-kube-api-access-nhctm\") on node \"crc\" DevicePath \"\""
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.561464 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faba52c6-cbea-45df-a598-5c47cde7c00a-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.708239 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkzch"]
Nov 25 13:19:39 crc kubenswrapper[4693]: I1125 13:19:39.717866 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gkzch"]
Nov 25 13:19:40 crc kubenswrapper[4693]: I1125 13:19:40.831710 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" path="/var/lib/kubelet/pods/faba52c6-cbea-45df-a598-5c47cde7c00a/volumes"
Nov 25 13:20:05 crc kubenswrapper[4693]: I1125 13:20:05.113726 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 13:20:05 crc kubenswrapper[4693]: I1125 13:20:05.114196 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.765394 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c7tgm/must-gather-8gx2r"]
Nov 25 13:20:07 crc kubenswrapper[4693]: E1125 13:20:07.766038 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="registry-server"
Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.766050 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="registry-server"
Nov 25 13:20:07 crc kubenswrapper[4693]: E1125 13:20:07.766092 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="extract-utilities"
containerName="extract-utilities" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.766099 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="extract-utilities" Nov 25 13:20:07 crc kubenswrapper[4693]: E1125 13:20:07.766114 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="extract-content" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.766121 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="extract-content" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.766321 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="faba52c6-cbea-45df-a598-5c47cde7c00a" containerName="registry-server" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.767352 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.774179 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c7tgm"/"kube-root-ca.crt" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.775676 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-c7tgm"/"default-dockercfg-dv92x" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.784204 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-c7tgm"/"openshift-service-ca.crt" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.794953 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c7tgm/must-gather-8gx2r"] Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.821189 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff92b9ab-328c-44f4-8d2e-8170223db6c8-must-gather-output\") pod \"must-gather-8gx2r\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.821288 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfq88\" (UniqueName: \"kubernetes.io/projected/ff92b9ab-328c-44f4-8d2e-8170223db6c8-kube-api-access-qfq88\") pod \"must-gather-8gx2r\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.925540 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff92b9ab-328c-44f4-8d2e-8170223db6c8-must-gather-output\") pod \"must-gather-8gx2r\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.925774 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfq88\" (UniqueName: \"kubernetes.io/projected/ff92b9ab-328c-44f4-8d2e-8170223db6c8-kube-api-access-qfq88\") pod \"must-gather-8gx2r\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:07 crc kubenswrapper[4693]: I1125 13:20:07.929066 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/ff92b9ab-328c-44f4-8d2e-8170223db6c8-must-gather-output\") pod \"must-gather-8gx2r\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:08 crc kubenswrapper[4693]: I1125 13:20:08.422293 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfq88\" (UniqueName: \"kubernetes.io/projected/ff92b9ab-328c-44f4-8d2e-8170223db6c8-kube-api-access-qfq88\") pod \"must-gather-8gx2r\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:08 crc kubenswrapper[4693]: I1125 13:20:08.688102 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:20:09 crc kubenswrapper[4693]: I1125 13:20:09.150149 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-c7tgm/must-gather-8gx2r"] Nov 25 13:20:09 crc kubenswrapper[4693]: I1125 13:20:09.660889 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" event={"ID":"ff92b9ab-328c-44f4-8d2e-8170223db6c8","Type":"ContainerStarted","Data":"f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde"} Nov 25 13:20:09 crc kubenswrapper[4693]: I1125 13:20:09.661231 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" event={"ID":"ff92b9ab-328c-44f4-8d2e-8170223db6c8","Type":"ContainerStarted","Data":"b7220073110bb8580704822194ccb455c5c1d22b1b823ffe8a1c71bf3a444e72"} Nov 25 13:20:10 crc kubenswrapper[4693]: I1125 13:20:10.672533 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" event={"ID":"ff92b9ab-328c-44f4-8d2e-8170223db6c8","Type":"ContainerStarted","Data":"2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc"} Nov 25 13:20:10 crc kubenswrapper[4693]: I1125 13:20:10.695281 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" podStartSLOduration=3.695258882 podStartE2EDuration="3.695258882s" podCreationTimestamp="2025-11-25 13:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 13:20:10.68512955 +0000 UTC m=+4330.603214941" watchObservedRunningTime="2025-11-25 13:20:10.695258882 +0000 UTC m=+4330.613344263" Nov 25 13:20:12 crc kubenswrapper[4693]: E1125 13:20:12.294907 4693 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.136:54474->38.102.83.136:42645: write tcp 38.102.83.136:54474->38.102.83.136:42645: write: broken pipe Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.054024 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-nns9x"] Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.055719 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.135072 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e49d261-190f-4c32-b26f-e50f5f283795-host\") pod \"crc-debug-nns9x\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.135132 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsm2l\" (UniqueName: \"kubernetes.io/projected/6e49d261-190f-4c32-b26f-e50f5f283795-kube-api-access-fsm2l\") pod \"crc-debug-nns9x\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.237268 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e49d261-190f-4c32-b26f-e50f5f283795-host\") pod \"crc-debug-nns9x\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.237338 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsm2l\" (UniqueName: \"kubernetes.io/projected/6e49d261-190f-4c32-b26f-e50f5f283795-kube-api-access-fsm2l\") pod \"crc-debug-nns9x\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.237411 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e49d261-190f-4c32-b26f-e50f5f283795-host\") pod \"crc-debug-nns9x\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.258150 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsm2l\" (UniqueName: \"kubernetes.io/projected/6e49d261-190f-4c32-b26f-e50f5f283795-kube-api-access-fsm2l\") pod \"crc-debug-nns9x\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.373958 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:13 crc kubenswrapper[4693]: I1125 13:20:13.707024 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" event={"ID":"6e49d261-190f-4c32-b26f-e50f5f283795","Type":"ContainerStarted","Data":"997d3510c595bac44f22e23aa2854c34584ae153dde7110607bb8156e2b7bfeb"} Nov 25 13:20:14 crc kubenswrapper[4693]: I1125 13:20:14.716890 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" event={"ID":"6e49d261-190f-4c32-b26f-e50f5f283795","Type":"ContainerStarted","Data":"48e036e0aded4fa5aec0157f1b2d94515ba578fb80c07b41c4f2b7f7b078c537"} Nov 25 13:20:14 crc kubenswrapper[4693]: I1125 13:20:14.730607 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" podStartSLOduration=1.7305430739999998 podStartE2EDuration="1.730543074s" podCreationTimestamp="2025-11-25 13:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-25 13:20:14.729573408 +0000 UTC m=+4334.647658799" watchObservedRunningTime="2025-11-25 13:20:14.730543074 +0000 UTC m=+4334.648628455" Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.113902 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.114440 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.114487 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.115245 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.115308 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" gracePeriod=600 Nov 25 13:20:35 crc kubenswrapper[4693]: E1125 13:20:35.248343 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" 
podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.908169 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" exitCode=0 Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.908238 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"} Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.908311 4693 scope.go:117] "RemoveContainer" containerID="794dc7bc073fb7a6e7f46a652046f35088cf87ccd0ec815580db9f16fa9c5083" Nov 25 13:20:35 crc kubenswrapper[4693]: I1125 13:20:35.909167 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:20:35 crc kubenswrapper[4693]: E1125 13:20:35.909449 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:20:47 crc kubenswrapper[4693]: I1125 13:20:47.024225 4693 generic.go:334] "Generic (PLEG): container finished" podID="6e49d261-190f-4c32-b26f-e50f5f283795" containerID="48e036e0aded4fa5aec0157f1b2d94515ba578fb80c07b41c4f2b7f7b078c537" exitCode=0 Nov 25 13:20:47 crc kubenswrapper[4693]: I1125 13:20:47.024330 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" event={"ID":"6e49d261-190f-4c32-b26f-e50f5f283795","Type":"ContainerDied","Data":"48e036e0aded4fa5aec0157f1b2d94515ba578fb80c07b41c4f2b7f7b078c537"} Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.630292 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.665624 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-nns9x"] Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.676965 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-nns9x"] Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.805542 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e49d261-190f-4c32-b26f-e50f5f283795-host\") pod \"6e49d261-190f-4c32-b26f-e50f5f283795\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.805848 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsm2l\" (UniqueName: \"kubernetes.io/projected/6e49d261-190f-4c32-b26f-e50f5f283795-kube-api-access-fsm2l\") pod \"6e49d261-190f-4c32-b26f-e50f5f283795\" (UID: \"6e49d261-190f-4c32-b26f-e50f5f283795\") " Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.805653 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e49d261-190f-4c32-b26f-e50f5f283795-host" (OuterVolumeSpecName: "host") pod "6e49d261-190f-4c32-b26f-e50f5f283795" (UID: "6e49d261-190f-4c32-b26f-e50f5f283795"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.806404 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e49d261-190f-4c32-b26f-e50f5f283795-host\") on node \"crc\" DevicePath \"\"" Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.811203 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e49d261-190f-4c32-b26f-e50f5f283795-kube-api-access-fsm2l" (OuterVolumeSpecName: "kube-api-access-fsm2l") pod "6e49d261-190f-4c32-b26f-e50f5f283795" (UID: "6e49d261-190f-4c32-b26f-e50f5f283795"). InnerVolumeSpecName "kube-api-access-fsm2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.813258 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:20:48 crc kubenswrapper[4693]: E1125 13:20:48.813731 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.826305 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e49d261-190f-4c32-b26f-e50f5f283795" path="/var/lib/kubelet/pods/6e49d261-190f-4c32-b26f-e50f5f283795/volumes" Nov 25 13:20:48 crc kubenswrapper[4693]: I1125 13:20:48.908263 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsm2l\" (UniqueName: \"kubernetes.io/projected/6e49d261-190f-4c32-b26f-e50f5f283795-kube-api-access-fsm2l\") on node \"crc\" DevicePath \"\"" Nov 25 13:20:49 crc kubenswrapper[4693]: I1125 13:20:49.052948 4693 scope.go:117] "RemoveContainer" containerID="48e036e0aded4fa5aec0157f1b2d94515ba578fb80c07b41c4f2b7f7b078c537" Nov 25 13:20:49 crc kubenswrapper[4693]: I1125 13:20:49.053040 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-nns9x" Nov 25 13:20:49 crc kubenswrapper[4693]: I1125 13:20:49.865889 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-86ffr"] Nov 25 13:20:49 crc kubenswrapper[4693]: E1125 13:20:49.866984 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e49d261-190f-4c32-b26f-e50f5f283795" containerName="container-00" Nov 25 13:20:49 crc kubenswrapper[4693]: I1125 13:20:49.867030 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e49d261-190f-4c32-b26f-e50f5f283795" containerName="container-00" Nov 25 13:20:49 crc kubenswrapper[4693]: I1125 13:20:49.867306 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e49d261-190f-4c32-b26f-e50f5f283795" containerName="container-00" Nov 25 13:20:49 crc kubenswrapper[4693]: I1125 13:20:49.868235 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:50 crc kubenswrapper[4693]: I1125 13:20:50.028990 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlgqf\" (UniqueName: \"kubernetes.io/projected/59f626d7-92ec-421d-8ef1-48370ddaf77f-kube-api-access-nlgqf\") pod \"crc-debug-86ffr\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:50 crc kubenswrapper[4693]: I1125 13:20:50.029173 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f626d7-92ec-421d-8ef1-48370ddaf77f-host\") pod \"crc-debug-86ffr\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:50 crc kubenswrapper[4693]: I1125 13:20:50.130673 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f626d7-92ec-421d-8ef1-48370ddaf77f-host\") pod \"crc-debug-86ffr\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:50 crc kubenswrapper[4693]: I1125 13:20:50.130805 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f626d7-92ec-421d-8ef1-48370ddaf77f-host\") pod \"crc-debug-86ffr\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:50 crc kubenswrapper[4693]: I1125 13:20:50.131203 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlgqf\" (UniqueName: \"kubernetes.io/projected/59f626d7-92ec-421d-8ef1-48370ddaf77f-kube-api-access-nlgqf\") pod \"crc-debug-86ffr\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:50 crc kubenswrapper[4693]: I1125 13:20:50.157581 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlgqf\" (UniqueName: \"kubernetes.io/projected/59f626d7-92ec-421d-8ef1-48370ddaf77f-kube-api-access-nlgqf\") pod \"crc-debug-86ffr\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:50 crc kubenswrapper[4693]: I1125 13:20:50.191814 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:51 crc kubenswrapper[4693]: I1125 13:20:51.075185 4693 generic.go:334] "Generic (PLEG): container finished" podID="59f626d7-92ec-421d-8ef1-48370ddaf77f" containerID="f509d08a00ca014dc42ce1d0223eb645ffffd506c8752d758acb480913796044" exitCode=0 Nov 25 13:20:51 crc kubenswrapper[4693]: I1125 13:20:51.075268 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/crc-debug-86ffr" event={"ID":"59f626d7-92ec-421d-8ef1-48370ddaf77f","Type":"ContainerDied","Data":"f509d08a00ca014dc42ce1d0223eb645ffffd506c8752d758acb480913796044"} Nov 25 13:20:51 crc kubenswrapper[4693]: I1125 13:20:51.075706 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/crc-debug-86ffr" event={"ID":"59f626d7-92ec-421d-8ef1-48370ddaf77f","Type":"ContainerStarted","Data":"8a9e7bc941a39b42d6bf8add404cf4b2ab6cbc2d825c30ec5f28f265c68f5529"} Nov 25 13:20:51 crc kubenswrapper[4693]: I1125 13:20:51.639832 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-86ffr"] Nov 25 13:20:51 crc kubenswrapper[4693]: I1125 13:20:51.649498 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-86ffr"] Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.178137 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.375221 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlgqf\" (UniqueName: \"kubernetes.io/projected/59f626d7-92ec-421d-8ef1-48370ddaf77f-kube-api-access-nlgqf\") pod \"59f626d7-92ec-421d-8ef1-48370ddaf77f\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.375461 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f626d7-92ec-421d-8ef1-48370ddaf77f-host\") pod \"59f626d7-92ec-421d-8ef1-48370ddaf77f\" (UID: \"59f626d7-92ec-421d-8ef1-48370ddaf77f\") " Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.375858 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/59f626d7-92ec-421d-8ef1-48370ddaf77f-host" (OuterVolumeSpecName: "host") pod "59f626d7-92ec-421d-8ef1-48370ddaf77f" (UID: "59f626d7-92ec-421d-8ef1-48370ddaf77f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.384144 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f626d7-92ec-421d-8ef1-48370ddaf77f-kube-api-access-nlgqf" (OuterVolumeSpecName: "kube-api-access-nlgqf") pod "59f626d7-92ec-421d-8ef1-48370ddaf77f" (UID: "59f626d7-92ec-421d-8ef1-48370ddaf77f"). InnerVolumeSpecName "kube-api-access-nlgqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.477346 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59f626d7-92ec-421d-8ef1-48370ddaf77f-host\") on node \"crc\" DevicePath \"\"" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.477399 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlgqf\" (UniqueName: \"kubernetes.io/projected/59f626d7-92ec-421d-8ef1-48370ddaf77f-kube-api-access-nlgqf\") on node \"crc\" DevicePath \"\"" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.798465 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-9kkqq"] Nov 25 13:20:52 crc kubenswrapper[4693]: E1125 13:20:52.798835 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f626d7-92ec-421d-8ef1-48370ddaf77f" containerName="container-00" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.798851 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f626d7-92ec-421d-8ef1-48370ddaf77f" containerName="container-00" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.799104 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f626d7-92ec-421d-8ef1-48370ddaf77f" containerName="container-00" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.799719 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.825020 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f626d7-92ec-421d-8ef1-48370ddaf77f" path="/var/lib/kubelet/pods/59f626d7-92ec-421d-8ef1-48370ddaf77f/volumes" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.986088 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f833915-8d05-4dca-819b-82ef25837c7c-host\") pod \"crc-debug-9kkqq\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:52 crc kubenswrapper[4693]: I1125 13:20:52.986516 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv2bh\" (UniqueName: \"kubernetes.io/projected/4f833915-8d05-4dca-819b-82ef25837c7c-kube-api-access-dv2bh\") pod \"crc-debug-9kkqq\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:53 crc kubenswrapper[4693]: I1125 13:20:53.089882 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f833915-8d05-4dca-819b-82ef25837c7c-host\") pod \"crc-debug-9kkqq\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:53 crc kubenswrapper[4693]: I1125 13:20:53.089985 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f833915-8d05-4dca-819b-82ef25837c7c-host\") pod \"crc-debug-9kkqq\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:53 crc kubenswrapper[4693]: I1125 13:20:53.090028 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv2bh\" (UniqueName: 
\"kubernetes.io/projected/4f833915-8d05-4dca-819b-82ef25837c7c-kube-api-access-dv2bh\") pod \"crc-debug-9kkqq\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:53 crc kubenswrapper[4693]: I1125 13:20:53.099465 4693 scope.go:117] "RemoveContainer" containerID="f509d08a00ca014dc42ce1d0223eb645ffffd506c8752d758acb480913796044" Nov 25 13:20:53 crc kubenswrapper[4693]: I1125 13:20:53.099475 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-86ffr" Nov 25 13:20:53 crc kubenswrapper[4693]: I1125 13:20:53.107571 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv2bh\" (UniqueName: \"kubernetes.io/projected/4f833915-8d05-4dca-819b-82ef25837c7c-kube-api-access-dv2bh\") pod \"crc-debug-9kkqq\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:53 crc kubenswrapper[4693]: I1125 13:20:53.115222 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:53 crc kubenswrapper[4693]: W1125 13:20:53.173173 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f833915_8d05_4dca_819b_82ef25837c7c.slice/crio-cfbbb1a1a3f3dfe59bd97f02f0a1fea5597cf867ac07e077ad137b4f01520d38 WatchSource:0}: Error finding container cfbbb1a1a3f3dfe59bd97f02f0a1fea5597cf867ac07e077ad137b4f01520d38: Status 404 returned error can't find the container with id cfbbb1a1a3f3dfe59bd97f02f0a1fea5597cf867ac07e077ad137b4f01520d38 Nov 25 13:20:54 crc kubenswrapper[4693]: I1125 13:20:54.113784 4693 generic.go:334] "Generic (PLEG): container finished" podID="4f833915-8d05-4dca-819b-82ef25837c7c" containerID="02400d07f3272b153ac43c47f657b952d05c72e5af4388da4b5beca2ca0671ef" exitCode=0 Nov 25 13:20:54 crc kubenswrapper[4693]: I1125 13:20:54.114026 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" event={"ID":"4f833915-8d05-4dca-819b-82ef25837c7c","Type":"ContainerDied","Data":"02400d07f3272b153ac43c47f657b952d05c72e5af4388da4b5beca2ca0671ef"} Nov 25 13:20:54 crc kubenswrapper[4693]: I1125 13:20:54.114112 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" event={"ID":"4f833915-8d05-4dca-819b-82ef25837c7c","Type":"ContainerStarted","Data":"cfbbb1a1a3f3dfe59bd97f02f0a1fea5597cf867ac07e077ad137b4f01520d38"} Nov 25 13:20:54 crc kubenswrapper[4693]: I1125 13:20:54.156532 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-9kkqq"] Nov 25 13:20:54 crc kubenswrapper[4693]: I1125 13:20:54.168802 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c7tgm/crc-debug-9kkqq"] Nov 25 13:20:55 crc kubenswrapper[4693]: I1125 13:20:55.229130 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:55 crc kubenswrapper[4693]: I1125 13:20:55.328845 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f833915-8d05-4dca-819b-82ef25837c7c-host\") pod \"4f833915-8d05-4dca-819b-82ef25837c7c\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " Nov 25 13:20:55 crc kubenswrapper[4693]: I1125 13:20:55.329177 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv2bh\" (UniqueName: \"kubernetes.io/projected/4f833915-8d05-4dca-819b-82ef25837c7c-kube-api-access-dv2bh\") pod \"4f833915-8d05-4dca-819b-82ef25837c7c\" (UID: \"4f833915-8d05-4dca-819b-82ef25837c7c\") " Nov 25 13:20:55 crc kubenswrapper[4693]: I1125 13:20:55.328949 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f833915-8d05-4dca-819b-82ef25837c7c-host" (OuterVolumeSpecName: "host") pod "4f833915-8d05-4dca-819b-82ef25837c7c" (UID: "4f833915-8d05-4dca-819b-82ef25837c7c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 25 13:20:55 crc kubenswrapper[4693]: I1125 13:20:55.329679 4693 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f833915-8d05-4dca-819b-82ef25837c7c-host\") on node \"crc\" DevicePath \"\"" Nov 25 13:20:55 crc kubenswrapper[4693]: I1125 13:20:55.334429 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f833915-8d05-4dca-819b-82ef25837c7c-kube-api-access-dv2bh" (OuterVolumeSpecName: "kube-api-access-dv2bh") pod "4f833915-8d05-4dca-819b-82ef25837c7c" (UID: "4f833915-8d05-4dca-819b-82ef25837c7c"). InnerVolumeSpecName "kube-api-access-dv2bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 25 13:20:55 crc kubenswrapper[4693]: I1125 13:20:55.431453 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv2bh\" (UniqueName: \"kubernetes.io/projected/4f833915-8d05-4dca-819b-82ef25837c7c-kube-api-access-dv2bh\") on node \"crc\" DevicePath \"\"" Nov 25 13:20:56 crc kubenswrapper[4693]: I1125 13:20:56.130751 4693 scope.go:117] "RemoveContainer" containerID="02400d07f3272b153ac43c47f657b952d05c72e5af4388da4b5beca2ca0671ef" Nov 25 13:20:56 crc kubenswrapper[4693]: I1125 13:20:56.130785 4693 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-c7tgm/crc-debug-9kkqq" Nov 25 13:20:56 crc kubenswrapper[4693]: I1125 13:20:56.824111 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f833915-8d05-4dca-819b-82ef25837c7c" path="/var/lib/kubelet/pods/4f833915-8d05-4dca-819b-82ef25837c7c/volumes" Nov 25 13:21:01 crc kubenswrapper[4693]: I1125 13:21:01.812675 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:21:01 crc kubenswrapper[4693]: E1125 13:21:01.813382 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:21:12 crc kubenswrapper[4693]: I1125 13:21:12.813516 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:21:12 crc kubenswrapper[4693]: E1125 13:21:12.814253 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:21:20 crc kubenswrapper[4693]: I1125 13:21:20.482482 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c49f9f854-clbv4_0c770547-1b83-43a7-ac47-82226ce02958/barbican-api/0.log" Nov 25 13:21:20 crc kubenswrapper[4693]: I1125 13:21:20.620913 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c49f9f854-clbv4_0c770547-1b83-43a7-ac47-82226ce02958/barbican-api-log/0.log" Nov 25 13:21:20 crc kubenswrapper[4693]: I1125 13:21:20.747907 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d949d856b-fbdcg_c743d467-4bdb-41ce-bf74-5051a93fc3d6/barbican-keystone-listener/0.log" Nov 25 13:21:20 crc kubenswrapper[4693]: I1125 13:21:20.756002 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d949d856b-fbdcg_c743d467-4bdb-41ce-bf74-5051a93fc3d6/barbican-keystone-listener-log/0.log" Nov 25 13:21:20 crc kubenswrapper[4693]: I1125 13:21:20.970744 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d9fc5569-wdqkp_084144ce-d043-4dd8-bc4b-e904c42e47cd/barbican-worker/0.log" Nov 25 13:21:20 crc kubenswrapper[4693]: I1125 13:21:20.985135 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7d9fc5569-wdqkp_084144ce-d043-4dd8-bc4b-e904c42e47cd/barbican-worker-log/0.log" Nov 25 13:21:21 crc kubenswrapper[4693]: I1125 13:21:21.152356 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-46z97_0c125840-c37c-445e-95d9-37c74703ea85/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:21 crc kubenswrapper[4693]: I1125 13:21:21.264048 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/ceilometer-central-agent/0.log" Nov 25 13:21:21 crc kubenswrapper[4693]: I1125 13:21:21.343011 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/ceilometer-notification-agent/0.log" Nov 25 13:21:21 crc kubenswrapper[4693]: I1125 13:21:21.512129 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/proxy-httpd/0.log" Nov 25 13:21:21 crc kubenswrapper[4693]: I1125 13:21:21.649169 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2e6a63b2-2650-4dc4-9a37-d7be65342b5d/sg-core/0.log" Nov 25 13:21:21 crc kubenswrapper[4693]: I1125 13:21:21.790035 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8aeba5db-6f5b-4714-9d46-5db9b9058cb6/cinder-api/0.log" Nov 25 13:21:21 crc kubenswrapper[4693]: I1125 13:21:21.868741 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8aeba5db-6f5b-4714-9d46-5db9b9058cb6/cinder-api-log/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.097622 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_42d5e91d-841b-453a-a5db-f2d1bf40fbec/probe/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.100310 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_42d5e91d-841b-453a-a5db-f2d1bf40fbec/cinder-scheduler/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.196551 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qfskr_216fd77e-1bfd-4c99-8fd8-2711d9de6beb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.285588 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4m8wd_9667d434-5214-4754-baea-bcc266b58358/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.413272 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d5cf5b645-f87gr_fcc8f52b-776d-4a49-b62f-bf73fcc35fe0/init/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.630615 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d5cf5b645-f87gr_fcc8f52b-776d-4a49-b62f-bf73fcc35fe0/init/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.651022 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2jqz2_81f5f268-3ead-442b-ae8f-d7e2c11a6752/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.694762 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5d5cf5b645-f87gr_fcc8f52b-776d-4a49-b62f-bf73fcc35fe0/dnsmasq-dns/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.842422 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bb2f0f2d-5d66-485f-a389-e07c52f143f2/glance-httpd/0.log" Nov 25 13:21:22 crc kubenswrapper[4693]: I1125 13:21:22.878323 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bb2f0f2d-5d66-485f-a389-e07c52f143f2/glance-log/0.log" Nov 25 
13:21:23 crc kubenswrapper[4693]: I1125 13:21:23.101359 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_711404f8-4ff3-44b1-b4f5-dfdc70ac930f/glance-httpd/0.log" Nov 25 13:21:23 crc kubenswrapper[4693]: I1125 13:21:23.121302 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_711404f8-4ff3-44b1-b4f5-dfdc70ac930f/glance-log/0.log" Nov 25 13:21:23 crc kubenswrapper[4693]: I1125 13:21:23.314203 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-574fd6fdfd-bz6sm_14ff5a36-1912-43a8-b87f-57a6858a5799/horizon/0.log" Nov 25 13:21:23 crc kubenswrapper[4693]: I1125 13:21:23.455818 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nrrmr_6e1fcb75-e8ac-49a2-bfd6-3607ea04ac4f/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:23 crc kubenswrapper[4693]: I1125 13:21:23.671971 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-kvl2g_c0c79f30-0e24-4101-8632-19de1642f7e2/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:23 crc kubenswrapper[4693]: I1125 13:21:23.827683 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-574fd6fdfd-bz6sm_14ff5a36-1912-43a8-b87f-57a6858a5799/horizon-log/0.log" Nov 25 13:21:23 crc kubenswrapper[4693]: I1125 13:21:23.984807 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29401261-7sn2t_3306a30d-dcff-4460-81f2-3561573e57a2/keystone-cron/0.log" Nov 25 13:21:24 crc kubenswrapper[4693]: I1125 13:21:24.098482 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7888468d67-2bztz_e3c87b9d-25f9-445f-be14-b43f1cb887a4/keystone-api/0.log" Nov 25 13:21:24 crc kubenswrapper[4693]: I1125 13:21:24.495881 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ee5b4281-3cdb-4bad-8002-8520136232a4/kube-state-metrics/2.log" Nov 25 13:21:24 crc kubenswrapper[4693]: I1125 13:21:24.568973 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ee5b4281-3cdb-4bad-8002-8520136232a4/kube-state-metrics/3.log" Nov 25 13:21:24 crc kubenswrapper[4693]: I1125 13:21:24.597257 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-njx62_df6120d2-3571-4059-8fb1-d40741960cff/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:24 crc kubenswrapper[4693]: I1125 13:21:24.948278 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c497f557-r9sp7_66ccc10f-a153-4582-ab8d-f687b0c6bb20/neutron-httpd/0.log" Nov 25 13:21:24 crc kubenswrapper[4693]: I1125 13:21:24.957151 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c497f557-r9sp7_66ccc10f-a153-4582-ab8d-f687b0c6bb20/neutron-api/0.log" Nov 25 13:21:25 crc kubenswrapper[4693]: I1125 13:21:25.020989 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-8z6kz_3e1334ad-6a95-4d72-95a2-5dfa8d78e530/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:25 crc kubenswrapper[4693]: I1125 13:21:25.853028 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_37605f96-b59e-45ad-b177-dad562d6af05/nova-api-api/0.log" Nov 25 13:21:26 
crc kubenswrapper[4693]: I1125 13:21:26.233606 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bdc79fdf-d996-42ba-b250-2501738ed0bc/nova-cell0-conductor-conductor/0.log" Nov 25 13:21:26 crc kubenswrapper[4693]: I1125 13:21:26.265923 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_37605f96-b59e-45ad-b177-dad562d6af05/nova-api-log/0.log" Nov 25 13:21:26 crc kubenswrapper[4693]: I1125 13:21:26.386219 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_35d79906-6ec5-4483-83ef-ae2ff2674c86/nova-cell1-conductor-conductor/0.log" Nov 25 13:21:26 crc kubenswrapper[4693]: I1125 13:21:26.625073 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-q4ddm_2a152944-4c08-47ee-bc41-90fa01d90bb1/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:26 crc kubenswrapper[4693]: I1125 13:21:26.631022 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f2e39ed4-ac1c-4961-80a1-24b93bed8f4b/nova-cell1-novncproxy-novncproxy/0.log" Nov 25 13:21:26 crc kubenswrapper[4693]: I1125 13:21:26.813226 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:21:26 crc kubenswrapper[4693]: E1125 13:21:26.813523 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:21:26 crc kubenswrapper[4693]: I1125 13:21:26.878818 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_599f11ff-5079-4815-be45-5ffac410eb82/nova-metadata-log/0.log" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.194998 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4eca8fd3-dd93-493a-9278-1749de83eae1/nova-scheduler-scheduler/0.log" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.225502 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9fc3b8be-d4cc-4bb4-86f0-5516294c1221/mysql-bootstrap/0.log" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.749515 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v45fj"] Nov 25 13:21:27 crc kubenswrapper[4693]: E1125 13:21:27.750111 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f833915-8d05-4dca-819b-82ef25837c7c" containerName="container-00" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.750131 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f833915-8d05-4dca-819b-82ef25837c7c" containerName="container-00" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.750416 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f833915-8d05-4dca-819b-82ef25837c7c" containerName="container-00" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.752153 4693 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.767304 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v45fj"] Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.853977 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-utilities\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.854530 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnd99\" (UniqueName: \"kubernetes.io/projected/95ca60ae-bb88-4d40-aa72-d228efce65fd-kube-api-access-cnd99\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.854652 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-catalog-content\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.942798 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9fc3b8be-d4cc-4bb4-86f0-5516294c1221/mysql-bootstrap/0.log" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.964024 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnd99\" (UniqueName: \"kubernetes.io/projected/95ca60ae-bb88-4d40-aa72-d228efce65fd-kube-api-access-cnd99\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.964092 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-catalog-content\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.964173 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-utilities\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.964656 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-utilities\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.965277 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-catalog-content\") pod \"redhat-operators-v45fj\" (UID: 
\"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:27 crc kubenswrapper[4693]: I1125 13:21:27.990650 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnd99\" (UniqueName: \"kubernetes.io/projected/95ca60ae-bb88-4d40-aa72-d228efce65fd-kube-api-access-cnd99\") pod \"redhat-operators-v45fj\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") " pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.086632 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.092628 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_9fc3b8be-d4cc-4bb4-86f0-5516294c1221/galera/0.log" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.146260 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4cf2be5d-1c6c-402f-bf93-e9653a6a84cd/mysql-bootstrap/0.log" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.453229 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4cf2be5d-1c6c-402f-bf93-e9653a6a84cd/mysql-bootstrap/0.log" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.595750 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4cf2be5d-1c6c-402f-bf93-e9653a6a84cd/galera/0.log" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.653219 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v45fj"] Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.684707 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_599f11ff-5079-4815-be45-5ffac410eb82/nova-metadata-metadata/0.log" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.755397 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d51f97b0-16ac-43b8-aa77-b2a66faef2cd/openstackclient/0.log" Nov 25 13:21:28 crc kubenswrapper[4693]: I1125 13:21:28.824816 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-szpg5_266234f1-8683-4a0d-a1ec-42cd82184f11/openstack-network-exporter/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.105866 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ndgsx_96236f54-53d2-47df-854b-51addeda1dee/ovn-controller/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.150828 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovsdb-server-init/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.375390 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovs-vswitchd/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.400288 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovsdb-server-init/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.473450 4693 generic.go:334] "Generic (PLEG): container finished" podID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerID="1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648" exitCode=0 Nov 25 13:21:29 crc kubenswrapper[4693]: 
I1125 13:21:29.473500 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v45fj" event={"ID":"95ca60ae-bb88-4d40-aa72-d228efce65fd","Type":"ContainerDied","Data":"1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648"} Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.473525 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v45fj" event={"ID":"95ca60ae-bb88-4d40-aa72-d228efce65fd","Type":"ContainerStarted","Data":"0fcbf6ed05185b738ad444c6837cea2ee38dae97461126b2ec43a1a939480a1b"} Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.609124 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8vhnn_93d2601b-fc82-478d-8667-dbce77606f4d/ovsdb-server/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.951194 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7ct5k_d7282171-b6bf-44b4-a5a3-f60d6d5baa5f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.959413 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5de2ed94-055d-4e4b-b069-3bcafd88cc3f/openstack-network-exporter/0.log" Nov 25 13:21:29 crc kubenswrapper[4693]: I1125 13:21:29.976293 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5de2ed94-055d-4e4b-b069-3bcafd88cc3f/ovn-northd/0.log" Nov 25 13:21:30 crc kubenswrapper[4693]: I1125 13:21:30.161313 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4d617274-42b9-4d07-b321-d70a5aeba8ee/openstack-network-exporter/0.log" Nov 25 13:21:30 crc kubenswrapper[4693]: I1125 13:21:30.169677 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4d617274-42b9-4d07-b321-d70a5aeba8ee/ovsdbserver-nb/0.log" Nov 25 13:21:30 crc kubenswrapper[4693]: I1125 13:21:30.278206 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_89a481b1-6040-4f15-a63f-d6d2301c3534/openstack-network-exporter/0.log" Nov 25 13:21:30 crc kubenswrapper[4693]: I1125 13:21:30.485541 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v45fj" event={"ID":"95ca60ae-bb88-4d40-aa72-d228efce65fd","Type":"ContainerStarted","Data":"7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882"} Nov 25 13:21:30 crc kubenswrapper[4693]: I1125 13:21:30.653392 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_89a481b1-6040-4f15-a63f-d6d2301c3534/ovsdbserver-sb/0.log" Nov 25 13:21:30 crc kubenswrapper[4693]: I1125 13:21:30.718072 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bf98548b6-68m92_d7f68eff-0e73-43ec-bb9a-97fd321b92ec/placement-api/0.log" Nov 25 13:21:30 crc kubenswrapper[4693]: I1125 13:21:30.976518 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7bf98548b6-68m92_d7f68eff-0e73-43ec-bb9a-97fd321b92ec/placement-log/0.log" Nov 25 13:21:31 crc kubenswrapper[4693]: I1125 13:21:31.137598 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5c73e56b-c0f3-4d6d-9e33-26fe0d552e24/setup-container/0.log" Nov 25 13:21:31 crc kubenswrapper[4693]: I1125 13:21:31.493788 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_98b0bc68-9551-407d-8390-66688e8255d3/setup-container/0.log" Nov 25 13:21:31 crc kubenswrapper[4693]: I1125 13:21:31.507226 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5c73e56b-c0f3-4d6d-9e33-26fe0d552e24/rabbitmq/0.log" Nov 25 13:21:31 crc kubenswrapper[4693]: I1125 13:21:31.528666 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5c73e56b-c0f3-4d6d-9e33-26fe0d552e24/setup-container/0.log" Nov 25 13:21:31 crc kubenswrapper[4693]: I1125 13:21:31.794039 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_98b0bc68-9551-407d-8390-66688e8255d3/setup-container/0.log" Nov 25 13:21:31 crc kubenswrapper[4693]: I1125 13:21:31.934569 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_98b0bc68-9551-407d-8390-66688e8255d3/rabbitmq/0.log" Nov 25 13:21:31 crc kubenswrapper[4693]: I1125 13:21:31.938195 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-v44l7_622af4c3-4b56-4b3c-8ea2-6d30432a706a/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.181077 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-65zzc_70ae1b8a-3af0-4d98-a633-6933a83b2b71/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.220857 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7bb4k_3f8577c4-f507-4e40-b284-66d57b0aee3d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.503899 4693 generic.go:334] "Generic (PLEG): container finished" podID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerID="7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882" exitCode=0 Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.504200 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v45fj" event={"ID":"95ca60ae-bb88-4d40-aa72-d228efce65fd","Type":"ContainerDied","Data":"7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882"} Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.510415 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.583282 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-vwwwd_a7dd6764-f6c6-4765-bdca-2f3b5dbf4e46/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.610958 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cx4gc_3200a40a-dfa0-40f7-a79d-054de8e9e386/ssh-known-hosts-edpm-deployment/0.log" Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.878813 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5df97c965f-mdrk8_3bce19a4-5298-4024-b291-19e2d6138081/proxy-server/0.log" Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 13:21:32.943524 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5df97c965f-mdrk8_3bce19a4-5298-4024-b291-19e2d6138081/proxy-httpd/0.log" Nov 25 13:21:32 crc kubenswrapper[4693]: I1125 
13:21:32.989725 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2kzct_88ff5ba0-ea04-4e77-9f16-05711082df93/swift-ring-rebalance/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.203001 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-reaper/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.258156 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-auditor/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.299326 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-replicator/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.377744 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/account-server/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.483091 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-auditor/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.532456 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v45fj" event={"ID":"95ca60ae-bb88-4d40-aa72-d228efce65fd","Type":"ContainerStarted","Data":"950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4"} Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.562049 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v45fj" podStartSLOduration=3.117644182 podStartE2EDuration="6.562024879s" podCreationTimestamp="2025-11-25 13:21:27 +0000 UTC" firstStartedPulling="2025-11-25 13:21:29.475331043 +0000 UTC m=+4409.393416424" lastFinishedPulling="2025-11-25 13:21:32.91971175 +0000 UTC m=+4412.837797121" observedRunningTime="2025-11-25 13:21:33.550077668 +0000 UTC m=+4413.468163049" watchObservedRunningTime="2025-11-25 13:21:33.562024879 +0000 UTC m=+4413.480110260" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.572824 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-server/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.579707 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-replicator/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.619099 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/container-updater/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.794212 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-auditor/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.862399 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-server/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.863102 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-replicator/0.log" Nov 25 13:21:33 crc kubenswrapper[4693]: I1125 13:21:33.879086 4693 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-expirer/0.log" Nov 25 13:21:34 crc kubenswrapper[4693]: I1125 13:21:34.005359 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/object-updater/0.log" Nov 25 13:21:34 crc kubenswrapper[4693]: I1125 13:21:34.079163 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/rsync/0.log" Nov 25 13:21:34 crc kubenswrapper[4693]: I1125 13:21:34.139111 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c8b28a97-55d7-41b0-aa09-55e4e132bd64/swift-recon-cron/0.log" Nov 25 13:21:34 crc kubenswrapper[4693]: I1125 13:21:34.400778 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-h89kr_ebbe9089-3f4f-46c6-a5ea-ff523e970069/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:34 crc kubenswrapper[4693]: I1125 13:21:34.427321 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9b66ebc6-a0f0-4418-8d28-7364b1f5d177/tempest-tests-tempest-tests-runner/0.log" Nov 25 13:21:34 crc kubenswrapper[4693]: I1125 13:21:34.568099 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_54397ebd-dc91-441f-9c68-261a1c952589/test-operator-logs-container/0.log" Nov 25 13:21:35 crc kubenswrapper[4693]: I1125 13:21:35.357145 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-4mrqv_6afe8ee4-7d98-4751-a224-b99437561d70/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 25 13:21:37 crc kubenswrapper[4693]: I1125 13:21:37.812553 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:21:37 crc kubenswrapper[4693]: E1125 13:21:37.813180 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:21:38 crc kubenswrapper[4693]: I1125 13:21:38.087144 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:38 crc kubenswrapper[4693]: I1125 13:21:38.090805 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v45fj" Nov 25 13:21:39 crc kubenswrapper[4693]: I1125 13:21:39.146926 4693 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v45fj" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="registry-server" probeResult="failure" output=< Nov 25 13:21:39 crc kubenswrapper[4693]: timeout: failed to connect service ":50051" within 1s Nov 25 13:21:39 crc kubenswrapper[4693]: > Nov 25 13:21:45 crc kubenswrapper[4693]: I1125 13:21:45.633479 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_459f5353-15bd-4139-a363-7a1bf6fe94cf/memcached/0.log" Nov 25 13:21:48 crc kubenswrapper[4693]: 
I1125 13:21:48.139205 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v45fj"
Nov 25 13:21:48 crc kubenswrapper[4693]: I1125 13:21:48.191198 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v45fj"
Nov 25 13:21:48 crc kubenswrapper[4693]: I1125 13:21:48.394727 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v45fj"]
Nov 25 13:21:49 crc kubenswrapper[4693]: I1125 13:21:49.652789 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v45fj" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="registry-server" containerID="cri-o://950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4" gracePeriod=2
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.219789 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v45fj"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.281560 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-catalog-content\") pod \"95ca60ae-bb88-4d40-aa72-d228efce65fd\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") "
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.281617 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-utilities\") pod \"95ca60ae-bb88-4d40-aa72-d228efce65fd\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") "
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.281664 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnd99\" (UniqueName: \"kubernetes.io/projected/95ca60ae-bb88-4d40-aa72-d228efce65fd-kube-api-access-cnd99\") pod \"95ca60ae-bb88-4d40-aa72-d228efce65fd\" (UID: \"95ca60ae-bb88-4d40-aa72-d228efce65fd\") "
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.283911 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-utilities" (OuterVolumeSpecName: "utilities") pod "95ca60ae-bb88-4d40-aa72-d228efce65fd" (UID: "95ca60ae-bb88-4d40-aa72-d228efce65fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.291857 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ca60ae-bb88-4d40-aa72-d228efce65fd-kube-api-access-cnd99" (OuterVolumeSpecName: "kube-api-access-cnd99") pod "95ca60ae-bb88-4d40-aa72-d228efce65fd" (UID: "95ca60ae-bb88-4d40-aa72-d228efce65fd"). InnerVolumeSpecName "kube-api-access-cnd99". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.384062 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.384104 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnd99\" (UniqueName: \"kubernetes.io/projected/95ca60ae-bb88-4d40-aa72-d228efce65fd-kube-api-access-cnd99\") on node \"crc\" DevicePath \"\""
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.414238 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95ca60ae-bb88-4d40-aa72-d228efce65fd" (UID: "95ca60ae-bb88-4d40-aa72-d228efce65fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.485826 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ca60ae-bb88-4d40-aa72-d228efce65fd-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.664007 4693 generic.go:334] "Generic (PLEG): container finished" podID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerID="950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4" exitCode=0
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.664050 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v45fj" event={"ID":"95ca60ae-bb88-4d40-aa72-d228efce65fd","Type":"ContainerDied","Data":"950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4"}
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.664061 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v45fj"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.664087 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v45fj" event={"ID":"95ca60ae-bb88-4d40-aa72-d228efce65fd","Type":"ContainerDied","Data":"0fcbf6ed05185b738ad444c6837cea2ee38dae97461126b2ec43a1a939480a1b"}
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.664107 4693 scope.go:117] "RemoveContainer" containerID="950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.686560 4693 scope.go:117] "RemoveContainer" containerID="7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.697527 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v45fj"]
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.708321 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v45fj"]
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.722809 4693 scope.go:117] "RemoveContainer" containerID="1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.751440 4693 scope.go:117] "RemoveContainer" containerID="950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4"
Nov 25 13:21:50 crc kubenswrapper[4693]: E1125 13:21:50.751877 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4\": container with ID starting with 950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4 not found: ID does not exist" containerID="950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.751920 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4"} err="failed to get container status \"950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4\": rpc error: code = NotFound desc = could not find container \"950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4\": container with ID starting with 950ca20b2a942f622897be77cd90d15f9081544ce22131c42bcbdc51fe978df4 not found: ID does not exist"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.751946 4693 scope.go:117] "RemoveContainer" containerID="7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882"
Nov 25 13:21:50 crc kubenswrapper[4693]: E1125 13:21:50.752348 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882\": container with ID starting with 7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882 not found: ID does not exist" containerID="7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.752397 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882"} err="failed to get container status \"7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882\": rpc error: code = NotFound desc = could not find container \"7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882\": container with ID starting with 7e06c28b49134181434d5a54da3bac316891518ee13358c848070617d228b882 not found: ID does not exist"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.752419 4693 scope.go:117] "RemoveContainer" containerID="1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648"
Nov 25 13:21:50 crc kubenswrapper[4693]: E1125 13:21:50.752636 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648\": container with ID starting with 1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648 not found: ID does not exist" containerID="1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648"
Nov 25 13:21:50 crc kubenswrapper[4693]: I1125 13:21:50.752660 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648"} err="failed to get container status \"1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648\": rpc error: code = NotFound desc = could not find container \"1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648\": container with ID starting with 1a73343521fe9c9f2d74bb8966561407fcda750c884bc63084cf2d794144c648 not found: ID does not exist"
path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/pull/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.362809 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/util/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.363855 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/pull/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.496295 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/pull/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.530857 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/util/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.561235 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_bbe0292a041351b2e91c74017e768208b36f144dd799fdf82c414fd15fw8swk_7c2f26eb-e680-4a45-8e01-bf653f711b07/extract/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.696282 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-4lt8v_4ab70f55-282f-4509-bc36-71ef2fe4d35b/kube-rbac-proxy/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.699480 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-4lt8v_4ab70f55-282f-4509-bc36-71ef2fe4d35b/manager/2.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.740991 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79856dc55c-4lt8v_4ab70f55-282f-4509-bc36-71ef2fe4d35b/manager/1.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.813816 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:22:03 crc kubenswrapper[4693]: E1125 13:22:03.814118 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.886786 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-6dtx6_9cc5c4a9-0119-48b6-a795-9f482b55278b/kube-rbac-proxy/0.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.919016 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-6dtx6_9cc5c4a9-0119-48b6-a795-9f482b55278b/manager/2.log" Nov 25 13:22:03 crc kubenswrapper[4693]: I1125 13:22:03.927628 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-7d695c9b56-6dtx6_9cc5c4a9-0119-48b6-a795-9f482b55278b/manager/1.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.058347 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-866fd_7cb65a4e-3294-4104-b3bf-6d1103b92c38/kube-rbac-proxy/0.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.085814 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-866fd_7cb65a4e-3294-4104-b3bf-6d1103b92c38/manager/2.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.129516 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-68b95954c9-866fd_7cb65a4e-3294-4104-b3bf-6d1103b92c38/manager/1.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.226113 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-nzz29_b29c9c21-026a-4701-99a7-769d382a2da2/kube-rbac-proxy/0.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.276940 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-nzz29_b29c9c21-026a-4701-99a7-769d382a2da2/manager/2.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.337742 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-774b86978c-nzz29_b29c9c21-026a-4701-99a7-769d382a2da2/manager/1.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.415790 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-fwwsj_4dd9cd53-1f66-4636-9fab-9f0b3ff38009/kube-rbac-proxy/0.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.502092 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-fwwsj_4dd9cd53-1f66-4636-9fab-9f0b3ff38009/manager/2.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.546815 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c9694994-fwwsj_4dd9cd53-1f66-4636-9fab-9f0b3ff38009/manager/1.log" Nov 25 13:22:04 crc kubenswrapper[4693]: I1125 13:22:04.624505 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-r86ct_5c98082e-070e-42b1-afdc-69cea132629e/kube-rbac-proxy/0.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.388316 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-r86ct_5c98082e-070e-42b1-afdc-69cea132629e/manager/1.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.418195 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-d5cc86f4b-r86ct_5c98082e-070e-42b1-afdc-69cea132629e/manager/2.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.438115 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-szrv4_3c29e8b9-57cf-4967-b5e2-a6af42c16099/kube-rbac-proxy/0.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.553674 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-szrv4_3c29e8b9-57cf-4967-b5e2-a6af42c16099/manager/2.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.610739 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bfcdc958c-szrv4_3c29e8b9-57cf-4967-b5e2-a6af42c16099/manager/1.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.612515 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-zcpsz_a64b0f5c-e6af-4903-925a-028aec5477fd/kube-rbac-proxy/0.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.651111 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-zcpsz_a64b0f5c-e6af-4903-925a-028aec5477fd/manager/2.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.747215 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-748dc6576f-zcpsz_a64b0f5c-e6af-4903-925a-028aec5477fd/manager/1.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.829181 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-5ghnq_bfeee7c1-207f-4862-b172-f2ffab4a1500/manager/2.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.833452 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-5ghnq_bfeee7c1-207f-4862-b172-f2ffab4a1500/kube-rbac-proxy/0.log" Nov 25 13:22:05 crc kubenswrapper[4693]: I1125 13:22:05.885007 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58bb8d67cc-5ghnq_bfeee7c1-207f-4862-b172-f2ffab4a1500/manager/1.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:05.999965 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-s9shw_22a83ecc-1f72-4474-a470-2ee4bef7eddf/kube-rbac-proxy/0.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.051719 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-s9shw_22a83ecc-1f72-4474-a470-2ee4bef7eddf/manager/1.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.057949 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-cb6c4fdb7-s9shw_22a83ecc-1f72-4474-a470-2ee4bef7eddf/manager/2.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.295032 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-csrpt_0f35f544-581e-4cb2-900f-71213e27477d/kube-rbac-proxy/0.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.313828 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-csrpt_0f35f544-581e-4cb2-900f-71213e27477d/manager/2.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.347166 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7c57c8bbc4-csrpt_0f35f544-581e-4cb2-900f-71213e27477d/manager/1.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.367122 4693 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-flxdz_7ecc8c23-d9b2-4d46-a8b0-76758035b267/kube-rbac-proxy/0.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.509907 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-flxdz_7ecc8c23-d9b2-4d46-a8b0-76758035b267/manager/1.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.539705 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-g972v_fe2a0074-66dc-4730-9321-772ee8fd8e28/kube-rbac-proxy/0.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.550335 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-flxdz_7ecc8c23-d9b2-4d46-a8b0-76758035b267/manager/2.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.574922 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-g972v_fe2a0074-66dc-4730-9321-772ee8fd8e28/manager/2.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.691928 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-fd75fd47d-g972v_fe2a0074-66dc-4730-9321-772ee8fd8e28/manager/1.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.716828 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-jlbhg_a7c4eb9b-38af-41da-872e-b3da515b2f88/kube-rbac-proxy/0.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.759672 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-jlbhg_a7c4eb9b-38af-41da-872e-b3da515b2f88/manager/1.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.788631 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-b58f89467-jlbhg_a7c4eb9b-38af-41da-872e-b3da515b2f88/manager/0.log" Nov 25 13:22:06 crc kubenswrapper[4693]: I1125 13:22:06.948847 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-rqjq9_c80a0f65-6193-435f-8138-eb5a4ba71b22/manager/1.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.021178 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-gk28d_ebf85cb6-2651-4b5f-9cbe-973db55e14c5/operator/1.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.126771 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7cd5954d9-rqjq9_c80a0f65-6193-435f-8138-eb5a4ba71b22/manager/2.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.268198 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7fm7p_b14001de-fa88-4632-87e8-e5a4d703e633/registry-server/0.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.308477 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7b567956b5-gk28d_ebf85cb6-2651-4b5f-9cbe-973db55e14c5/operator/0.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.337851 4693 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-k2njb_1c7db975-17d7-48dd-8e5a-0549749ab866/kube-rbac-proxy/0.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.413703 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-k2njb_1c7db975-17d7-48dd-8e5a-0549749ab866/manager/2.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.471588 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-66cf5c67ff-k2njb_1c7db975-17d7-48dd-8e5a-0549749ab866/manager/1.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.492362 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-f4trp_f6bc1c64-200f-492f-bad9-dfecd5687698/manager/2.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.510715 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-f4trp_f6bc1c64-200f-492f-bad9-dfecd5687698/kube-rbac-proxy/0.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.601615 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5db546f9d9-f4trp_f6bc1c64-200f-492f-bad9-dfecd5687698/manager/1.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.677037 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-qbjp2_28782f20-4534-4137-b590-7a3b31c638b2/operator/1.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.720209 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-qbjp2_28782f20-4534-4137-b590-7a3b31c638b2/operator/2.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.863636 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-bnf27_c3a7c8cb-ac3c-43d3-b38d-0c3625c53196/kube-rbac-proxy/0.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.881783 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-bnf27_c3a7c8cb-ac3c-43d3-b38d-0c3625c53196/manager/1.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.928325 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6fdc4fcf86-bnf27_c3a7c8cb-ac3c-43d3-b38d-0c3625c53196/manager/2.log" Nov 25 13:22:07 crc kubenswrapper[4693]: I1125 13:22:07.940410 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-cwrvs_b9227546-dcce-4b09-9311-19f844deb318/kube-rbac-proxy/0.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.070746 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-cwrvs_b9227546-dcce-4b09-9311-19f844deb318/manager/2.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.103203 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-567f98c9d-cwrvs_b9227546-dcce-4b09-9311-19f844deb318/manager/1.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.125430 4693 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-kmpm8_ef0b302b-05d0-4be3-85ad-7eb3d70cec36/kube-rbac-proxy/0.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.165775 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-kmpm8_ef0b302b-05d0-4be3-85ad-7eb3d70cec36/manager/1.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.270078 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cb74df96-kmpm8_ef0b302b-05d0-4be3-85ad-7eb3d70cec36/manager/0.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.315160 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-tc9jb_105791fd-407d-44a3-8fc8-af90e82b0f63/kube-rbac-proxy/0.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.404800 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-tc9jb_105791fd-407d-44a3-8fc8-af90e82b0f63/manager/2.log" Nov 25 13:22:08 crc kubenswrapper[4693]: I1125 13:22:08.406888 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-864885998-tc9jb_105791fd-407d-44a3-8fc8-af90e82b0f63/manager/1.log" Nov 25 13:22:18 crc kubenswrapper[4693]: I1125 13:22:18.820701 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:22:18 crc kubenswrapper[4693]: E1125 13:22:18.821537 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:22:26 crc kubenswrapper[4693]: I1125 13:22:26.888279 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5v4pg_264f1d17-cf59-4dbf-ad2f-0272713fe3b0/control-plane-machine-set-operator/0.log" Nov 25 13:22:27 crc kubenswrapper[4693]: I1125 13:22:27.094208 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mkt5h_bccd7dbe-e658-4ce4-be99-b6642a5bb498/kube-rbac-proxy/0.log" Nov 25 13:22:27 crc kubenswrapper[4693]: I1125 13:22:27.121589 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mkt5h_bccd7dbe-e658-4ce4-be99-b6642a5bb498/machine-api-operator/0.log" Nov 25 13:22:29 crc kubenswrapper[4693]: I1125 13:22:29.812553 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:22:29 crc kubenswrapper[4693]: E1125 13:22:29.813037 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 
Nov 25 13:22:40 crc kubenswrapper[4693]: I1125 13:22:40.066013 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-zc5h2_b9284e69-f82a-44ea-bee3-627c08d1d86c/cert-manager-controller/0.log"
Nov 25 13:22:40 crc kubenswrapper[4693]: I1125 13:22:40.251820 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-krwsc_886fc2dd-e1c6-4822-b516-1540c9e77f39/cert-manager-cainjector/0.log"
Nov 25 13:22:40 crc kubenswrapper[4693]: I1125 13:22:40.289127 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-krwsc_886fc2dd-e1c6-4822-b516-1540c9e77f39/cert-manager-cainjector/1.log"
Nov 25 13:22:40 crc kubenswrapper[4693]: I1125 13:22:40.334650 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-94qfr_6fff8a61-6848-4e20-bc9b-cc0d8e4299d4/cert-manager-webhook/0.log"
Nov 25 13:22:43 crc kubenswrapper[4693]: I1125 13:22:43.813243 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:22:43 crc kubenswrapper[4693]: E1125 13:22:43.814666 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:22:52 crc kubenswrapper[4693]: I1125 13:22:52.690809 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-wc25g_fef9b8d4-8a67-486c-84d4-f0053c7efe32/nmstate-console-plugin/0.log"
Nov 25 13:22:52 crc kubenswrapper[4693]: I1125 13:22:52.856641 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k95kj_b2d6c353-42d6-4c35-8c14-925f97540979/nmstate-handler/0.log"
Nov 25 13:22:52 crc kubenswrapper[4693]: I1125 13:22:52.934052 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-sxzv8_0a6d6078-b39e-4528-a765-5624dee71294/nmstate-metrics/0.log"
Nov 25 13:22:52 crc kubenswrapper[4693]: I1125 13:22:52.934166 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-sxzv8_0a6d6078-b39e-4528-a765-5624dee71294/kube-rbac-proxy/0.log"
Nov 25 13:22:53 crc kubenswrapper[4693]: I1125 13:22:53.132551 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-vv6jm_d8ae7877-8f8c-4fb0-bb42-ec809dcb6d4d/nmstate-operator/0.log"
Nov 25 13:22:53 crc kubenswrapper[4693]: I1125 13:22:53.156966 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-wv8pz_27d9ec74-a9f1-4971-a6ad-16703ad324ad/nmstate-webhook/0.log"
Nov 25 13:22:55 crc kubenswrapper[4693]: I1125 13:22:55.812983 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:22:55 crc kubenswrapper[4693]: E1125 13:22:55.813332 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:23:06 crc kubenswrapper[4693]: I1125 13:23:06.812986 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:23:06 crc kubenswrapper[4693]: E1125 13:23:06.813790 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.029449 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-m86lr_89cc79c1-2d72-47b8-abcb-14af4fb9afe7/kube-rbac-proxy/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.133532 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-m86lr_89cc79c1-2d72-47b8-abcb-14af4fb9afe7/controller/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.234095 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.379923 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.390388 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.437387 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.487791 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.662339 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.669610 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.671581 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.701240 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.869143 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-frr-files/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.870277 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-metrics/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.889849 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/controller/0.log"
Nov 25 13:23:07 crc kubenswrapper[4693]: I1125 13:23:07.918995 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/cp-reloader/0.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.068813 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/frr-metrics/0.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.105858 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/kube-rbac-proxy-frr/0.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.123387 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/kube-rbac-proxy/0.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.312072 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/reloader/0.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.320361 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-54csl_b92f94fa-96a8-4257-890e-076b4292b487/frr-k8s-webhook-server/0.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.549793 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5995bbfc5f-c8gkc_0d2b9e6f-fe11-47e3-af7b-cca0fff65798/manager/3.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.573279 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5995bbfc5f-c8gkc_0d2b9e6f-fe11-47e3-af7b-cca0fff65798/manager/2.log"
Nov 25 13:23:08 crc kubenswrapper[4693]: I1125 13:23:08.740197 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b574576ff-z9ftm_9c9e4728-76ad-4ae9-8ef9-87cff7db96c3/webhook-server/0.log"
Nov 25 13:23:09 crc kubenswrapper[4693]: I1125 13:23:09.085330 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dnzwb_667fdb6a-4e0e-4b92-ae50-aa1880c69402/kube-rbac-proxy/0.log"
Nov 25 13:23:09 crc kubenswrapper[4693]: I1125 13:23:09.579326 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-dnzwb_667fdb6a-4e0e-4b92-ae50-aa1880c69402/speaker/0.log"
Nov 25 13:23:09 crc kubenswrapper[4693]: I1125 13:23:09.616521 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-tz5hq_43653fbc-4dc9-437e-a5f7-8cc774881d8a/frr/0.log"
Nov 25 13:23:18 crc kubenswrapper[4693]: I1125 13:23:18.812920 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:23:18 crc kubenswrapper[4693]: E1125 13:23:18.814636 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:23:22 crc kubenswrapper[4693]: I1125 13:23:22.701270 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/util/0.log"
Nov 25 13:23:23 crc kubenswrapper[4693]: I1125 13:23:23.469068 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/util/0.log"
Nov 25 13:23:23 crc kubenswrapper[4693]: I1125 13:23:23.469550 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/pull/0.log"
Nov 25 13:23:23 crc kubenswrapper[4693]: I1125 13:23:23.496428 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/pull/0.log"
Nov 25 13:23:23 crc kubenswrapper[4693]: I1125 13:23:23.638111 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/pull/0.log"
Nov 25 13:23:23 crc kubenswrapper[4693]: I1125 13:23:23.679820 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/util/0.log"
Nov 25 13:23:23 crc kubenswrapper[4693]: I1125 13:23:23.717200 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9mb8q_47fe04a3-31d1-4c8f-bccd-109447168f70/extract/0.log"
Nov 25 13:23:23 crc kubenswrapper[4693]: I1125 13:23:23.833749 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-utilities/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.010786 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-utilities/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.029057 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-content/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.075056 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-content/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.165317 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-content/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.167816 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/extract-utilities/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.401495 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-utilities/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.933469 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-utilities/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.954170 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bd7mr_5ca5d5dc-02ea-48c2-9a3e-944359d44d84/registry-server/0.log"
Nov 25 13:23:24 crc kubenswrapper[4693]: I1125 13:23:24.998013 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-content/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.070412 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-content/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.148595 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-utilities/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.200278 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/extract-content/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.509517 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/util/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.722109 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/pull/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.758276 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/pull/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.776155 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/util/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.904368 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qkzhq_24c65baa-b858-4f7f-8d19-a2e6ce7019a6/registry-server/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.984619 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/util/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.986607 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/extract/0.log"
Nov 25 13:23:25 crc kubenswrapper[4693]: I1125 13:23:25.999139 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6lbj8m_f6fa2a73-3c18-4d17-8c57-1698fa8d987b/pull/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.170069 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-87z2l_c97ed1b2-4d1e-45f8-9aa7-67336324d2cc/marketplace-operator/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.201159 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lm42v_f1511af3-3ca9-439c-b941-18a792f99932/extract-utilities/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.357779 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lm42v_f1511af3-3ca9-439c-b941-18a792f99932/extract-utilities/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.363519 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lm42v_f1511af3-3ca9-439c-b941-18a792f99932/extract-content/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.379569 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lm42v_f1511af3-3ca9-439c-b941-18a792f99932/extract-content/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.543892 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lm42v_f1511af3-3ca9-439c-b941-18a792f99932/extract-utilities/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.546896 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lm42v_f1511af3-3ca9-439c-b941-18a792f99932/extract-content/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.588957 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lm42v_f1511af3-3ca9-439c-b941-18a792f99932/registry-server/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.597892 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-utilities/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.807357 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-content/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.808150 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-utilities/0.log"
Nov 25 13:23:26 crc kubenswrapper[4693]: I1125 13:23:26.825837 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-content/0.log"
Nov 25 13:23:27 crc kubenswrapper[4693]: I1125 13:23:27.014826 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-content/0.log"
Nov 25 13:23:27 crc kubenswrapper[4693]: I1125 13:23:27.018318 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/extract-utilities/0.log"
Nov 25 13:23:27 crc kubenswrapper[4693]: I1125 13:23:27.205071 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rt9rp_a31b362f-c747-4bf0-bcce-27a2761b95e6/registry-server/0.log"
Nov 25 13:23:32 crc kubenswrapper[4693]: I1125 13:23:32.813105 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:23:32 crc kubenswrapper[4693]: E1125 13:23:32.813949 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:23:47 crc kubenswrapper[4693]: I1125 13:23:47.845360 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:23:47 crc kubenswrapper[4693]: E1125 13:23:47.846235 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:23:58 crc kubenswrapper[4693]: I1125 13:23:58.813328 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:23:58 crc kubenswrapper[4693]: E1125 13:23:58.814155 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:24:11 crc kubenswrapper[4693]: I1125 13:24:11.815867 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:24:11 crc kubenswrapper[4693]: E1125 13:24:11.817215 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
Nov 25 13:24:23 crc kubenswrapper[4693]: I1125 13:24:23.813772 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:24:23 crc kubenswrapper[4693]: E1125 13:24:23.814757 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f"
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-86xln"] Nov 25 13:24:24 crc kubenswrapper[4693]: E1125 13:24:24.729935 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="extract-utilities" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.729962 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="extract-utilities" Nov 25 13:24:24 crc kubenswrapper[4693]: E1125 13:24:24.729987 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="extract-content" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.729996 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="extract-content" Nov 25 13:24:24 crc kubenswrapper[4693]: E1125 13:24:24.730006 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="registry-server" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.730014 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="registry-server" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.730246 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ca60ae-bb88-4d40-aa72-d228efce65fd" containerName="registry-server" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.732396 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.775704 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86xln"] Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.897321 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-catalog-content\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.897528 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-utilities\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.897567 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfkt9\" (UniqueName: \"kubernetes.io/projected/f9b6ba07-99d5-4468-812f-0721d5f92c88-kube-api-access-rfkt9\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.998937 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-utilities\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 
13:24:24.998988 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfkt9\" (UniqueName: \"kubernetes.io/projected/f9b6ba07-99d5-4468-812f-0721d5f92c88-kube-api-access-rfkt9\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.999068 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-catalog-content\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.999521 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-utilities\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:24 crc kubenswrapper[4693]: I1125 13:24:24.999541 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-catalog-content\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:25 crc kubenswrapper[4693]: I1125 13:24:25.020466 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfkt9\" (UniqueName: \"kubernetes.io/projected/f9b6ba07-99d5-4468-812f-0721d5f92c88-kube-api-access-rfkt9\") pod \"community-operators-86xln\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") " pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:25 crc kubenswrapper[4693]: I1125 13:24:25.073174 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 13:24:25 crc kubenswrapper[4693]: W1125 13:24:25.600254 4693 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b6ba07_99d5_4468_812f_0721d5f92c88.slice/crio-538eccda58efd0e21d4c7295569bf3e30e84ec6319be9c24ef261b5286f400e7 WatchSource:0}: Error finding container 538eccda58efd0e21d4c7295569bf3e30e84ec6319be9c24ef261b5286f400e7: Status 404 returned error can't find the container with id 538eccda58efd0e21d4c7295569bf3e30e84ec6319be9c24ef261b5286f400e7
Nov 25 13:24:25 crc kubenswrapper[4693]: I1125 13:24:25.607774 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-86xln"]
Nov 25 13:24:26 crc kubenswrapper[4693]: I1125 13:24:26.365980 4693 generic.go:334] "Generic (PLEG): container finished" podID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerID="5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4" exitCode=0
Nov 25 13:24:26 crc kubenswrapper[4693]: I1125 13:24:26.366585 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86xln" event={"ID":"f9b6ba07-99d5-4468-812f-0721d5f92c88","Type":"ContainerDied","Data":"5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4"}
Nov 25 13:24:26 crc kubenswrapper[4693]: I1125 13:24:26.366618 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86xln" event={"ID":"f9b6ba07-99d5-4468-812f-0721d5f92c88","Type":"ContainerStarted","Data":"538eccda58efd0e21d4c7295569bf3e30e84ec6319be9c24ef261b5286f400e7"}
Nov 25 13:24:28 crc kubenswrapper[4693]: I1125 13:24:28.388317 4693 generic.go:334] "Generic (PLEG): container finished" podID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerID="130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d" exitCode=0
Nov 25 13:24:28 crc kubenswrapper[4693]: I1125 13:24:28.388496 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86xln" event={"ID":"f9b6ba07-99d5-4468-812f-0721d5f92c88","Type":"ContainerDied","Data":"130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d"}
Nov 25 13:24:29 crc kubenswrapper[4693]: I1125 13:24:29.406348 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86xln" event={"ID":"f9b6ba07-99d5-4468-812f-0721d5f92c88","Type":"ContainerStarted","Data":"e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e"}
Nov 25 13:24:29 crc kubenswrapper[4693]: I1125 13:24:29.435050 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-86xln" podStartSLOduration=2.763274522 podStartE2EDuration="5.435028571s" podCreationTimestamp="2025-11-25 13:24:24 +0000 UTC" firstStartedPulling="2025-11-25 13:24:26.369339268 +0000 UTC m=+4586.287424649" lastFinishedPulling="2025-11-25 13:24:29.041093317 +0000 UTC m=+4588.959178698" observedRunningTime="2025-11-25 13:24:29.426201683 +0000 UTC m=+4589.344287074" watchObservedRunningTime="2025-11-25 13:24:29.435028571 +0000 UTC m=+4589.353113962"
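The podStartSLOduration in that entry can be reproduced from its own monotonic clock offsets (the m=+... suffixes), consistent with the SLO duration being end-to-end startup time minus the image-pull window. A minimal sketch in Go, using only the numbers quoted above:

package main

import "fmt"

func main() {
	// Monotonic offsets (seconds) from the pod_startup_latency_tracker entry above.
	firstStartedPulling := 4586.287424649
	lastFinishedPulling := 4588.959178698
	podStartE2E := 5.435028571 // podStartE2EDuration

	pullWindow := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window:   %.9fs\n", pullWindow)             // 2.671754049s
	fmt.Printf("podStartSLOduration: %.9fs\n", podStartE2E-pullWindow) // 2.763274522s, matching the log exactly
}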
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:24:35 crc kubenswrapper[4693]: I1125 13:24:35.073532 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:35 crc kubenswrapper[4693]: I1125 13:24:35.073614 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:35 crc kubenswrapper[4693]: I1125 13:24:35.129843 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:35 crc kubenswrapper[4693]: I1125 13:24:35.517428 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-86xln" Nov 25 13:24:35 crc kubenswrapper[4693]: I1125 13:24:35.569996 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86xln"] Nov 25 13:24:37 crc kubenswrapper[4693]: I1125 13:24:37.488514 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-86xln" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="registry-server" containerID="cri-o://e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e" gracePeriod=2 Nov 25 13:24:37 crc kubenswrapper[4693]: E1125 13:24:37.721623 4693 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b6ba07_99d5_4468_812f_0721d5f92c88.slice/crio-e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e.scope\": RecentStats: unable to find data in memory cache]" Nov 25 13:24:37 crc kubenswrapper[4693]: I1125 13:24:37.958232 4693 util.go:48] "No ready sandbox for pod can be found. 
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.044455 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfkt9\" (UniqueName: \"kubernetes.io/projected/f9b6ba07-99d5-4468-812f-0721d5f92c88-kube-api-access-rfkt9\") pod \"f9b6ba07-99d5-4468-812f-0721d5f92c88\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") "
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.044764 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-utilities\") pod \"f9b6ba07-99d5-4468-812f-0721d5f92c88\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") "
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.044922 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-catalog-content\") pod \"f9b6ba07-99d5-4468-812f-0721d5f92c88\" (UID: \"f9b6ba07-99d5-4468-812f-0721d5f92c88\") "
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.045855 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-utilities" (OuterVolumeSpecName: "utilities") pod "f9b6ba07-99d5-4468-812f-0721d5f92c88" (UID: "f9b6ba07-99d5-4468-812f-0721d5f92c88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.052304 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b6ba07-99d5-4468-812f-0721d5f92c88-kube-api-access-rfkt9" (OuterVolumeSpecName: "kube-api-access-rfkt9") pod "f9b6ba07-99d5-4468-812f-0721d5f92c88" (UID: "f9b6ba07-99d5-4468-812f-0721d5f92c88"). InnerVolumeSpecName "kube-api-access-rfkt9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.108808 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9b6ba07-99d5-4468-812f-0721d5f92c88" (UID: "f9b6ba07-99d5-4468-812f-0721d5f92c88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.147488 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.147534 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfkt9\" (UniqueName: \"kubernetes.io/projected/f9b6ba07-99d5-4468-812f-0721d5f92c88-kube-api-access-rfkt9\") on node \"crc\" DevicePath \"\"" Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.147551 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9b6ba07-99d5-4468-812f-0721d5f92c88-utilities\") on node \"crc\" DevicePath \"\"" Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.499278 4693 generic.go:334] "Generic (PLEG): container finished" podID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerID="e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e" exitCode=0 Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.499326 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86xln" event={"ID":"f9b6ba07-99d5-4468-812f-0721d5f92c88","Type":"ContainerDied","Data":"e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e"} Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.499356 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-86xln" event={"ID":"f9b6ba07-99d5-4468-812f-0721d5f92c88","Type":"ContainerDied","Data":"538eccda58efd0e21d4c7295569bf3e30e84ec6319be9c24ef261b5286f400e7"} Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.499446 4693 scope.go:117] "RemoveContainer" containerID="e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e" Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.499532 4693 util.go:48] "No ready sandbox for pod can be found. 
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.523190 4693 scope.go:117] "RemoveContainer" containerID="130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d"
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.540166 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-86xln"]
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.552284 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-86xln"]
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.564799 4693 scope.go:117] "RemoveContainer" containerID="5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4"
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.593130 4693 scope.go:117] "RemoveContainer" containerID="e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e"
Nov 25 13:24:38 crc kubenswrapper[4693]: E1125 13:24:38.593718 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e\": container with ID starting with e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e not found: ID does not exist" containerID="e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e"
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.593756 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e"} err="failed to get container status \"e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e\": rpc error: code = NotFound desc = could not find container \"e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e\": container with ID starting with e760a37e5aac00e17837ac96a2f54ced515f42630dbedac153c98e72e44d391e not found: ID does not exist"
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.593793 4693 scope.go:117] "RemoveContainer" containerID="130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d"
Nov 25 13:24:38 crc kubenswrapper[4693]: E1125 13:24:38.594171 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d\": container with ID starting with 130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d not found: ID does not exist" containerID="130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d"
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.594216 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d"} err="failed to get container status \"130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d\": rpc error: code = NotFound desc = could not find container \"130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d\": container with ID starting with 130ddb6f329abeb7ac29917c9efbdebe149cbaadaf19b510e59c039a951f275d not found: ID does not exist"
Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.594249 4693 scope.go:117] "RemoveContainer" containerID="5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4"
Nov 25 13:24:38 crc kubenswrapper[4693]: E1125 13:24:38.594732 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4\": container with ID starting with 5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4 not found: ID does not exist" containerID="5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4"
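These NotFound errors are the benign tail of pod deletion: the kubelet retries RemoveContainer for containers the runtime has already garbage-collected, the CRI status lookup fails with gRPC NotFound, and the error is logged and then dropped. Schematically (a sketch of the pattern, not kubelet source):

package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the CRI runtime's gRPC NotFound status.
var errNotFound = errors.New("NotFound: ID does not exist")

// removeContainer deletes id if present; a second call reports NotFound.
func removeContainer(id string, containers map[string]bool) error {
	if !containers[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	delete(containers, id)
	return nil
}

func main() {
	containers := map[string]bool{"e760a37e": true}
	for _, id := range []string{"e760a37e", "e760a37e"} {
		err := removeContainer(id, containers)
		if errors.Is(err, errNotFound) {
			fmt.Println("benign, already gone:", err) // what the log records
		} else if err == nil {
			fmt.Println("removed", id)
		}
	}
}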
failed" err="rpc error: code = NotFound desc = could not find container \"5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4\": container with ID starting with 5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4 not found: ID does not exist" containerID="5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4" Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.594782 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4"} err="failed to get container status \"5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4\": rpc error: code = NotFound desc = could not find container \"5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4\": container with ID starting with 5b07291fb1569fc9693ef48e4256cc0912752ae08d7329aaa58addbed95c66d4 not found: ID does not exist" Nov 25 13:24:38 crc kubenswrapper[4693]: I1125 13:24:38.825264 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" path="/var/lib/kubelet/pods/f9b6ba07-99d5-4468-812f-0721d5f92c88/volumes" Nov 25 13:24:49 crc kubenswrapper[4693]: I1125 13:24:49.813039 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:24:49 crc kubenswrapper[4693]: E1125 13:24:49.813847 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:25:01 crc kubenswrapper[4693]: I1125 13:25:01.814026 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:25:01 crc kubenswrapper[4693]: E1125 13:25:01.814782 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:25:16 crc kubenswrapper[4693]: I1125 13:25:16.812942 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:25:16 crc kubenswrapper[4693]: E1125 13:25:16.813681 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:25:20 crc kubenswrapper[4693]: I1125 13:25:20.906385 4693 generic.go:334] "Generic (PLEG): container finished" podID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerID="f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde" exitCode=0 Nov 25 13:25:20 crc kubenswrapper[4693]: I1125 13:25:20.906411 4693 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" event={"ID":"ff92b9ab-328c-44f4-8d2e-8170223db6c8","Type":"ContainerDied","Data":"f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde"} Nov 25 13:25:20 crc kubenswrapper[4693]: I1125 13:25:20.907595 4693 scope.go:117] "RemoveContainer" containerID="f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde" Nov 25 13:25:21 crc kubenswrapper[4693]: I1125 13:25:21.567293 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c7tgm_must-gather-8gx2r_ff92b9ab-328c-44f4-8d2e-8170223db6c8/gather/0.log" Nov 25 13:25:31 crc kubenswrapper[4693]: I1125 13:25:31.812618 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:25:31 crc kubenswrapper[4693]: E1125 13:25:31.813321 4693 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6d66d_openshift-machine-config-operator(f238a1e7-499b-466f-b643-bef0ae6f5e5f)\"" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.354042 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-c7tgm/must-gather-8gx2r"] Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.354323 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerName="copy" containerID="cri-o://2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc" gracePeriod=2 Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.375486 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-c7tgm/must-gather-8gx2r"] Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.839773 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c7tgm_must-gather-8gx2r_ff92b9ab-328c-44f4-8d2e-8170223db6c8/copy/0.log" Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.840484 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/must-gather-8gx2r" Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.906119 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfq88\" (UniqueName: \"kubernetes.io/projected/ff92b9ab-328c-44f4-8d2e-8170223db6c8-kube-api-access-qfq88\") pod \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.906194 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff92b9ab-328c-44f4-8d2e-8170223db6c8-must-gather-output\") pod \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\" (UID: \"ff92b9ab-328c-44f4-8d2e-8170223db6c8\") " Nov 25 13:25:32 crc kubenswrapper[4693]: I1125 13:25:32.911295 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff92b9ab-328c-44f4-8d2e-8170223db6c8-kube-api-access-qfq88" (OuterVolumeSpecName: "kube-api-access-qfq88") pod "ff92b9ab-328c-44f4-8d2e-8170223db6c8" (UID: "ff92b9ab-328c-44f4-8d2e-8170223db6c8"). 
Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.008913 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfq88\" (UniqueName: \"kubernetes.io/projected/ff92b9ab-328c-44f4-8d2e-8170223db6c8-kube-api-access-qfq88\") on node \"crc\" DevicePath \"\""
Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.022306 4693 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-c7tgm_must-gather-8gx2r_ff92b9ab-328c-44f4-8d2e-8170223db6c8/copy/0.log"
Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.023697 4693 generic.go:334] "Generic (PLEG): container finished" podID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerID="2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc" exitCode=143
Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.023764 4693 scope.go:117] "RemoveContainer" containerID="2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc"
Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.023780 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-c7tgm/must-gather-8gx2r"
Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.046478 4693 scope.go:117] "RemoveContainer" containerID="f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde"
Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.087630 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff92b9ab-328c-44f4-8d2e-8170223db6c8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ff92b9ab-328c-44f4-8d2e-8170223db6c8" (UID: "ff92b9ab-328c-44f4-8d2e-8170223db6c8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
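The exitCode=143 recorded above for the copy container is the conventional 128+signal encoding: the process exited on SIGTERM (signal 15) after the gracePeriod=2 kill, in contrast to the exitCode=0 of containers that finished on their own. A one-line check of the arithmetic:

package main

import "fmt"

func main() {
	// Shell-style exit status for a signal-terminated process: 128 + signo.
	const sigterm = 15
	fmt.Println(128 + sigterm) // 143, the copy container's exit code
}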
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.110713 4693 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff92b9ab-328c-44f4-8d2e-8170223db6c8-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.125861 4693 scope.go:117] "RemoveContainer" containerID="2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc" Nov 25 13:25:33 crc kubenswrapper[4693]: E1125 13:25:33.126248 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc\": container with ID starting with 2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc not found: ID does not exist" containerID="2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc" Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.126283 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc"} err="failed to get container status \"2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc\": rpc error: code = NotFound desc = could not find container \"2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc\": container with ID starting with 2be25e83eb6d6ab243507f9ec73efe2862fb16d7af599638eb73d179844085dc not found: ID does not exist" Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.126304 4693 scope.go:117] "RemoveContainer" containerID="f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde" Nov 25 13:25:33 crc kubenswrapper[4693]: E1125 13:25:33.126530 4693 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde\": container with ID starting with f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde not found: ID does not exist" containerID="f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde" Nov 25 13:25:33 crc kubenswrapper[4693]: I1125 13:25:33.126556 4693 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde"} err="failed to get container status \"f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde\": rpc error: code = NotFound desc = could not find container \"f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde\": container with ID starting with f0b7396a103a53f541f80ac63c64645d1fc644806bbae98ecbb79f4c30130bde not found: ID does not exist" Nov 25 13:25:34 crc kubenswrapper[4693]: I1125 13:25:34.826942 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" path="/var/lib/kubelet/pods/ff92b9ab-328c-44f4-8d2e-8170223db6c8/volumes" Nov 25 13:25:45 crc kubenswrapper[4693]: I1125 13:25:45.813340 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf" Nov 25 13:25:46 crc kubenswrapper[4693]: I1125 13:25:46.148053 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" 
event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"60649717c0b79e2592d0fc4b7f58f407bbcb0ee6d7fa36fb9ddd168d4c975e27"} Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.345526 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8wwxz"] Nov 25 13:27:23 crc kubenswrapper[4693]: E1125 13:27:23.350106 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="extract-content" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.350316 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="extract-content" Nov 25 13:27:23 crc kubenswrapper[4693]: E1125 13:27:23.350556 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="extract-utilities" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.350682 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="extract-utilities" Nov 25 13:27:23 crc kubenswrapper[4693]: E1125 13:27:23.350873 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="registry-server" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.350959 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="registry-server" Nov 25 13:27:23 crc kubenswrapper[4693]: E1125 13:27:23.351065 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerName="gather" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.351177 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerName="gather" Nov 25 13:27:23 crc kubenswrapper[4693]: E1125 13:27:23.351303 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerName="copy" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.351429 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerName="copy" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.351770 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerName="gather" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.351874 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff92b9ab-328c-44f4-8d2e-8170223db6c8" containerName="copy" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.351967 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b6ba07-99d5-4468-812f-0721d5f92c88" containerName="registry-server" Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.353954 4693 util.go:30] "No sandbox for pod can be found. 
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.371751 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wwxz"]
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.501598 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-utilities\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.501968 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h7z4\" (UniqueName: \"kubernetes.io/projected/221543b2-28f5-449d-84d6-9fc0e4b8c07f-kube-api-access-7h7z4\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.502182 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-catalog-content\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.604518 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-catalog-content\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.605126 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-utilities\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.605276 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-catalog-content\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.605490 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h7z4\" (UniqueName: \"kubernetes.io/projected/221543b2-28f5-449d-84d6-9fc0e4b8c07f-kube-api-access-7h7z4\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.605508 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-utilities\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.629018 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h7z4\" (UniqueName: \"kubernetes.io/projected/221543b2-28f5-449d-84d6-9fc0e4b8c07f-kube-api-access-7h7z4\") pod \"redhat-marketplace-8wwxz\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") " pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:23 crc kubenswrapper[4693]: I1125 13:27:23.690948 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:24 crc kubenswrapper[4693]: I1125 13:27:24.222391 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wwxz"]
Nov 25 13:27:24 crc kubenswrapper[4693]: I1125 13:27:24.584895 4693 generic.go:334] "Generic (PLEG): container finished" podID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerID="aae1af65dfeb864e60ddd1032580b088d9d08f65e38e71caccc72f073b70743c" exitCode=0
Nov 25 13:27:24 crc kubenswrapper[4693]: I1125 13:27:24.584940 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wwxz" event={"ID":"221543b2-28f5-449d-84d6-9fc0e4b8c07f","Type":"ContainerDied","Data":"aae1af65dfeb864e60ddd1032580b088d9d08f65e38e71caccc72f073b70743c"}
Nov 25 13:27:24 crc kubenswrapper[4693]: I1125 13:27:24.584968 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wwxz" event={"ID":"221543b2-28f5-449d-84d6-9fc0e4b8c07f","Type":"ContainerStarted","Data":"460fd206bf7bdb17e7c8bf3461fb8ee8029a238ec8445b1e9b4807ac88bdc62c"}
Nov 25 13:27:24 crc kubenswrapper[4693]: I1125 13:27:24.587070 4693 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 25 13:27:25 crc kubenswrapper[4693]: I1125 13:27:25.595248 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wwxz" event={"ID":"221543b2-28f5-449d-84d6-9fc0e4b8c07f","Type":"ContainerStarted","Data":"c248c2a5fa0136b6f797a01d5703d6aa4a4193972601a9e1146850642f44242e"}
Nov 25 13:27:26 crc kubenswrapper[4693]: I1125 13:27:26.607524 4693 generic.go:334] "Generic (PLEG): container finished" podID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerID="c248c2a5fa0136b6f797a01d5703d6aa4a4193972601a9e1146850642f44242e" exitCode=0
Nov 25 13:27:26 crc kubenswrapper[4693]: I1125 13:27:26.607580 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wwxz" event={"ID":"221543b2-28f5-449d-84d6-9fc0e4b8c07f","Type":"ContainerDied","Data":"c248c2a5fa0136b6f797a01d5703d6aa4a4193972601a9e1146850642f44242e"}
Nov 25 13:27:27 crc kubenswrapper[4693]: I1125 13:27:27.619735 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wwxz" event={"ID":"221543b2-28f5-449d-84d6-9fc0e4b8c07f","Type":"ContainerStarted","Data":"ad85772ed625608f05ffa44c5bae247161d7ddadb169dbd360569b89a8e1c77d"}
Nov 25 13:27:27 crc kubenswrapper[4693]: I1125 13:27:27.642200 4693 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8wwxz" podStartSLOduration=2.183506686 podStartE2EDuration="4.642177509s" podCreationTimestamp="2025-11-25 13:27:23 +0000 UTC" firstStartedPulling="2025-11-25 13:27:24.586760192 +0000 UTC m=+4764.504845593" lastFinishedPulling="2025-11-25 13:27:27.045430995 +0000 UTC m=+4766.963516416" observedRunningTime="2025-11-25 13:27:27.638481889 +0000 UTC m=+4767.556567280" watchObservedRunningTime="2025-11-25 13:27:27.642177509 +0000 UTC m=+4767.560262890"
m=+4767.560262890" Nov 25 13:27:33 crc kubenswrapper[4693]: I1125 13:27:33.691610 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8wwxz" Nov 25 13:27:33 crc kubenswrapper[4693]: I1125 13:27:33.692445 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8wwxz" Nov 25 13:27:33 crc kubenswrapper[4693]: I1125 13:27:33.742500 4693 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8wwxz" Nov 25 13:27:34 crc kubenswrapper[4693]: I1125 13:27:34.722392 4693 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8wwxz" Nov 25 13:27:34 crc kubenswrapper[4693]: I1125 13:27:34.994820 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wwxz"] Nov 25 13:27:36 crc kubenswrapper[4693]: I1125 13:27:36.698474 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8wwxz" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="registry-server" containerID="cri-o://ad85772ed625608f05ffa44c5bae247161d7ddadb169dbd360569b89a8e1c77d" gracePeriod=2 Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.718770 4693 generic.go:334] "Generic (PLEG): container finished" podID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerID="ad85772ed625608f05ffa44c5bae247161d7ddadb169dbd360569b89a8e1c77d" exitCode=0 Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.718822 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wwxz" event={"ID":"221543b2-28f5-449d-84d6-9fc0e4b8c07f","Type":"ContainerDied","Data":"ad85772ed625608f05ffa44c5bae247161d7ddadb169dbd360569b89a8e1c77d"} Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.718856 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wwxz" event={"ID":"221543b2-28f5-449d-84d6-9fc0e4b8c07f","Type":"ContainerDied","Data":"460fd206bf7bdb17e7c8bf3461fb8ee8029a238ec8445b1e9b4807ac88bdc62c"} Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.718871 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="460fd206bf7bdb17e7c8bf3461fb8ee8029a238ec8445b1e9b4807ac88bdc62c" Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.720428 4693 util.go:48] "No ready sandbox for pod can be found. 
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.844543 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h7z4\" (UniqueName: \"kubernetes.io/projected/221543b2-28f5-449d-84d6-9fc0e4b8c07f-kube-api-access-7h7z4\") pod \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") "
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.844612 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-utilities\") pod \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") "
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.845829 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-catalog-content\") pod \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\" (UID: \"221543b2-28f5-449d-84d6-9fc0e4b8c07f\") "
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.845985 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-utilities" (OuterVolumeSpecName: "utilities") pod "221543b2-28f5-449d-84d6-9fc0e4b8c07f" (UID: "221543b2-28f5-449d-84d6-9fc0e4b8c07f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.846520 4693 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-utilities\") on node \"crc\" DevicePath \"\""
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.850025 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221543b2-28f5-449d-84d6-9fc0e4b8c07f-kube-api-access-7h7z4" (OuterVolumeSpecName: "kube-api-access-7h7z4") pod "221543b2-28f5-449d-84d6-9fc0e4b8c07f" (UID: "221543b2-28f5-449d-84d6-9fc0e4b8c07f"). InnerVolumeSpecName "kube-api-access-7h7z4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.876584 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "221543b2-28f5-449d-84d6-9fc0e4b8c07f" (UID: "221543b2-28f5-449d-84d6-9fc0e4b8c07f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.949125 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h7z4\" (UniqueName: \"kubernetes.io/projected/221543b2-28f5-449d-84d6-9fc0e4b8c07f-kube-api-access-7h7z4\") on node \"crc\" DevicePath \"\""
Nov 25 13:27:37 crc kubenswrapper[4693]: I1125 13:27:37.949465 4693 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/221543b2-28f5-449d-84d6-9fc0e4b8c07f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 25 13:27:38 crc kubenswrapper[4693]: I1125 13:27:38.728025 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wwxz"
Nov 25 13:27:38 crc kubenswrapper[4693]: I1125 13:27:38.762575 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wwxz"]
Nov 25 13:27:38 crc kubenswrapper[4693]: I1125 13:27:38.784130 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wwxz"]
Nov 25 13:27:38 crc kubenswrapper[4693]: I1125 13:27:38.822889 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" path="/var/lib/kubelet/pods/221543b2-28f5-449d-84d6-9fc0e4b8c07f/volumes"
Nov 25 13:28:05 crc kubenswrapper[4693]: I1125 13:28:05.113774 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 13:28:05 crc kubenswrapper[4693]: I1125 13:28:05.114340 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 13:28:35 crc kubenswrapper[4693]: I1125 13:28:35.113907 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 13:28:35 crc kubenswrapper[4693]: I1125 13:28:35.114553 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.114133 4693 patch_prober.go:28] interesting pod/machine-config-daemon-6d66d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.114667 4693 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.114711 4693 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6d66d"
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.115454 4693 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60649717c0b79e2592d0fc4b7f58f407bbcb0ee6d7fa36fb9ddd168d4c975e27"} pod="openshift-machine-config-operator/machine-config-daemon-6d66d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
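Each Liveness failure above is the kubelet's HTTP prober doing a GET against http://127.0.0.1:8798/health and getting connection refused; once the failure threshold is reached the container is killed and restarted (the "will be restarted" message, followed by the gracePeriod=600 kill below). A rough stand-in for a single probe attempt, not the kubelet's prober code:

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// One HTTP liveness attempt against the endpoint from the log; a
	// refused connection surfaces as an error, i.e. a probe failure.
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		fmt.Println("probe failure:", err) // e.g. connect: connection refused
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe result:", resp.Status)
}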
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.115508 4693 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" podUID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerName="machine-config-daemon" containerID="cri-o://60649717c0b79e2592d0fc4b7f58f407bbcb0ee6d7fa36fb9ddd168d4c975e27" gracePeriod=600
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.613173 4693 generic.go:334] "Generic (PLEG): container finished" podID="f238a1e7-499b-466f-b643-bef0ae6f5e5f" containerID="60649717c0b79e2592d0fc4b7f58f407bbcb0ee6d7fa36fb9ddd168d4c975e27" exitCode=0
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.613245 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerDied","Data":"60649717c0b79e2592d0fc4b7f58f407bbcb0ee6d7fa36fb9ddd168d4c975e27"}
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.613599 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6d66d" event={"ID":"f238a1e7-499b-466f-b643-bef0ae6f5e5f","Type":"ContainerStarted","Data":"66fb79c8be8a802cb3ea5bfa7aead15218f517818e33ef5d6e9a34d7f6dff684"}
Nov 25 13:29:05 crc kubenswrapper[4693]: I1125 13:29:05.613629 4693 scope.go:117] "RemoveContainer" containerID="35582394d59614b0e8db976a3889db107cc7a5ddf6bd98994e17b92b8966aebf"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.179186 4693 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"]
Nov 25 13:30:00 crc kubenswrapper[4693]: E1125 13:30:00.180272 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="extract-content"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.180292 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="extract-content"
Nov 25 13:30:00 crc kubenswrapper[4693]: E1125 13:30:00.180323 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="extract-utilities"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.180333 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="extract-utilities"
Nov 25 13:30:00 crc kubenswrapper[4693]: E1125 13:30:00.180363 4693 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="registry-server"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.180391 4693 state_mem.go:107] "Deleted CPUSet assignment" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="registry-server"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.180659 4693 memory_manager.go:354] "RemoveStaleState removing state" podUID="221543b2-28f5-449d-84d6-9fc0e4b8c07f" containerName="registry-server"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.181425 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.184044 4693 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.185101 4693 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.216593 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bmqg\" (UniqueName: \"kubernetes.io/projected/b62555bb-2b8a-4524-84ef-06f8100ba1c8-kube-api-access-2bmqg\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.216895 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b62555bb-2b8a-4524-84ef-06f8100ba1c8-config-volume\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.217559 4693 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b62555bb-2b8a-4524-84ef-06f8100ba1c8-secret-volume\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.237659 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"]
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.324254 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b62555bb-2b8a-4524-84ef-06f8100ba1c8-secret-volume\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.324389 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bmqg\" (UniqueName: \"kubernetes.io/projected/b62555bb-2b8a-4524-84ef-06f8100ba1c8-kube-api-access-2bmqg\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.324501 4693 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b62555bb-2b8a-4524-84ef-06f8100ba1c8-config-volume\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.326166 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b62555bb-2b8a-4524-84ef-06f8100ba1c8-config-volume\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"
\"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.334073 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b62555bb-2b8a-4524-84ef-06f8100ba1c8-secret-volume\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.356267 4693 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bmqg\" (UniqueName: \"kubernetes.io/projected/b62555bb-2b8a-4524-84ef-06f8100ba1c8-kube-api-access-2bmqg\") pod \"collect-profiles-29401290-pz2dp\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.503334 4693 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" Nov 25 13:30:00 crc kubenswrapper[4693]: I1125 13:30:00.964917 4693 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp"] Nov 25 13:30:01 crc kubenswrapper[4693]: I1125 13:30:01.152754 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" event={"ID":"b62555bb-2b8a-4524-84ef-06f8100ba1c8","Type":"ContainerStarted","Data":"d25c110db738a380f4a30a120c61ca90f07ea0d0872192ac2b7779236bcd627d"} Nov 25 13:30:02 crc kubenswrapper[4693]: I1125 13:30:02.166682 4693 generic.go:334] "Generic (PLEG): container finished" podID="b62555bb-2b8a-4524-84ef-06f8100ba1c8" containerID="70867c6632f0ed8cc0027ad79df69d5a742f07f068cb7b54bbc2965985ca05b0" exitCode=0 Nov 25 13:30:02 crc kubenswrapper[4693]: I1125 13:30:02.166741 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" event={"ID":"b62555bb-2b8a-4524-84ef-06f8100ba1c8","Type":"ContainerDied","Data":"70867c6632f0ed8cc0027ad79df69d5a742f07f068cb7b54bbc2965985ca05b0"} Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.069342 4693 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.191860 4693 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29401290-pz2dp" event={"ID":"b62555bb-2b8a-4524-84ef-06f8100ba1c8","Type":"ContainerDied","Data":"d25c110db738a380f4a30a120c61ca90f07ea0d0872192ac2b7779236bcd627d"} Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.191911 4693 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d25c110db738a380f4a30a120c61ca90f07ea0d0872192ac2b7779236bcd627d" Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.191977 4693 util.go:48] "No ready sandbox for pod can be found. 
Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.224540 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b62555bb-2b8a-4524-84ef-06f8100ba1c8-secret-volume\") pod \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") "
Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.224729 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b62555bb-2b8a-4524-84ef-06f8100ba1c8-config-volume\") pod \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") "
Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.224819 4693 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bmqg\" (UniqueName: \"kubernetes.io/projected/b62555bb-2b8a-4524-84ef-06f8100ba1c8-kube-api-access-2bmqg\") pod \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\" (UID: \"b62555bb-2b8a-4524-84ef-06f8100ba1c8\") "
Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.227710 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b62555bb-2b8a-4524-84ef-06f8100ba1c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "b62555bb-2b8a-4524-84ef-06f8100ba1c8" (UID: "b62555bb-2b8a-4524-84ef-06f8100ba1c8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.231251 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62555bb-2b8a-4524-84ef-06f8100ba1c8-kube-api-access-2bmqg" (OuterVolumeSpecName: "kube-api-access-2bmqg") pod "b62555bb-2b8a-4524-84ef-06f8100ba1c8" (UID: "b62555bb-2b8a-4524-84ef-06f8100ba1c8"). InnerVolumeSpecName "kube-api-access-2bmqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.231457 4693 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b62555bb-2b8a-4524-84ef-06f8100ba1c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b62555bb-2b8a-4524-84ef-06f8100ba1c8" (UID: "b62555bb-2b8a-4524-84ef-06f8100ba1c8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.326601 4693 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b62555bb-2b8a-4524-84ef-06f8100ba1c8-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.326636 4693 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b62555bb-2b8a-4524-84ef-06f8100ba1c8-config-volume\") on node \"crc\" DevicePath \"\"" Nov 25 13:30:04 crc kubenswrapper[4693]: I1125 13:30:04.326648 4693 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bmqg\" (UniqueName: \"kubernetes.io/projected/b62555bb-2b8a-4524-84ef-06f8100ba1c8-kube-api-access-2bmqg\") on node \"crc\" DevicePath \"\"" Nov 25 13:30:05 crc kubenswrapper[4693]: I1125 13:30:05.150547 4693 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f"] Nov 25 13:30:05 crc kubenswrapper[4693]: I1125 13:30:05.160112 4693 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29401245-nsf4f"] Nov 25 13:30:06 crc kubenswrapper[4693]: I1125 13:30:06.823643 4693 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9e0867-f6aa-43ca-b148-a2850b65ab16" path="/var/lib/kubelet/pods/cb9e0867-f6aa-43ca-b148-a2850b65ab16/volumes"